Dec  1 04:09:00 np0005540826 kernel: Linux version 5.14.0-642.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-14), GNU ld version 2.35.2-68.el9) #1 SMP PREEMPT_DYNAMIC Thu Nov 20 14:15:03 UTC 2025
Dec  1 04:09:00 np0005540826 kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Dec  1 04:09:00 np0005540826 kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-642.el9.x86_64 root=UUID=b277050f-8ace-464d-abb6-4c46d4c45253 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Dec  1 04:09:00 np0005540826 kernel: BIOS-provided physical RAM map:
Dec  1 04:09:00 np0005540826 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Dec  1 04:09:00 np0005540826 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Dec  1 04:09:00 np0005540826 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Dec  1 04:09:00 np0005540826 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Dec  1 04:09:00 np0005540826 kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Dec  1 04:09:00 np0005540826 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Dec  1 04:09:00 np0005540826 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Dec  1 04:09:00 np0005540826 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Dec  1 04:09:00 np0005540826 kernel: NX (Execute Disable) protection: active
Dec  1 04:09:00 np0005540826 kernel: APIC: Static calls initialized
Dec  1 04:09:00 np0005540826 kernel: SMBIOS 2.8 present.
Dec  1 04:09:00 np0005540826 kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Dec  1 04:09:00 np0005540826 kernel: Hypervisor detected: KVM
Dec  1 04:09:00 np0005540826 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Dec  1 04:09:00 np0005540826 kernel: kvm-clock: using sched offset of 3961128290 cycles
Dec  1 04:09:00 np0005540826 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Dec  1 04:09:00 np0005540826 kernel: tsc: Detected 2800.000 MHz processor
Dec  1 04:09:00 np0005540826 kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Dec  1 04:09:00 np0005540826 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Dec  1 04:09:00 np0005540826 kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Dec  1 04:09:00 np0005540826 kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Dec  1 04:09:00 np0005540826 kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Dec  1 04:09:00 np0005540826 kernel: Using GB pages for direct mapping
Dec  1 04:09:00 np0005540826 kernel: RAMDISK: [mem 0x2d83a000-0x32c14fff]
Dec  1 04:09:00 np0005540826 kernel: ACPI: Early table checksum verification disabled
Dec  1 04:09:00 np0005540826 kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Dec  1 04:09:00 np0005540826 kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec  1 04:09:00 np0005540826 kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec  1 04:09:00 np0005540826 kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec  1 04:09:00 np0005540826 kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Dec  1 04:09:00 np0005540826 kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec  1 04:09:00 np0005540826 kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec  1 04:09:00 np0005540826 kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Dec  1 04:09:00 np0005540826 kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Dec  1 04:09:00 np0005540826 kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Dec  1 04:09:00 np0005540826 kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Dec  1 04:09:00 np0005540826 kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Dec  1 04:09:00 np0005540826 kernel: No NUMA configuration found
Dec  1 04:09:00 np0005540826 kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Dec  1 04:09:00 np0005540826 kernel: NODE_DATA(0) allocated [mem 0x23ffd5000-0x23fffffff]
Dec  1 04:09:00 np0005540826 kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Dec  1 04:09:00 np0005540826 kernel: Zone ranges:
Dec  1 04:09:00 np0005540826 kernel:  DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Dec  1 04:09:00 np0005540826 kernel:  DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Dec  1 04:09:00 np0005540826 kernel:  Normal   [mem 0x0000000100000000-0x000000023fffffff]
Dec  1 04:09:00 np0005540826 kernel:  Device   empty
Dec  1 04:09:00 np0005540826 kernel: Movable zone start for each node
Dec  1 04:09:00 np0005540826 kernel: Early memory node ranges
Dec  1 04:09:00 np0005540826 kernel:  node   0: [mem 0x0000000000001000-0x000000000009efff]
Dec  1 04:09:00 np0005540826 kernel:  node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Dec  1 04:09:00 np0005540826 kernel:  node   0: [mem 0x0000000100000000-0x000000023fffffff]
Dec  1 04:09:00 np0005540826 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Dec  1 04:09:00 np0005540826 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Dec  1 04:09:00 np0005540826 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Dec  1 04:09:00 np0005540826 kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Dec  1 04:09:00 np0005540826 kernel: ACPI: PM-Timer IO Port: 0x608
Dec  1 04:09:00 np0005540826 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Dec  1 04:09:00 np0005540826 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Dec  1 04:09:00 np0005540826 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Dec  1 04:09:00 np0005540826 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Dec  1 04:09:00 np0005540826 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Dec  1 04:09:00 np0005540826 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Dec  1 04:09:00 np0005540826 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Dec  1 04:09:00 np0005540826 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Dec  1 04:09:00 np0005540826 kernel: TSC deadline timer available
Dec  1 04:09:00 np0005540826 kernel: CPU topo: Max. logical packages:   8
Dec  1 04:09:00 np0005540826 kernel: CPU topo: Max. logical dies:       8
Dec  1 04:09:00 np0005540826 kernel: CPU topo: Max. dies per package:   1
Dec  1 04:09:00 np0005540826 kernel: CPU topo: Max. threads per core:   1
Dec  1 04:09:00 np0005540826 kernel: CPU topo: Num. cores per package:     1
Dec  1 04:09:00 np0005540826 kernel: CPU topo: Num. threads per package:   1
Dec  1 04:09:00 np0005540826 kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Dec  1 04:09:00 np0005540826 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Dec  1 04:09:00 np0005540826 kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Dec  1 04:09:00 np0005540826 kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Dec  1 04:09:00 np0005540826 kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Dec  1 04:09:00 np0005540826 kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Dec  1 04:09:00 np0005540826 kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Dec  1 04:09:00 np0005540826 kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Dec  1 04:09:00 np0005540826 kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Dec  1 04:09:00 np0005540826 kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Dec  1 04:09:00 np0005540826 kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Dec  1 04:09:00 np0005540826 kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Dec  1 04:09:00 np0005540826 kernel: Booting paravirtualized kernel on KVM
Dec  1 04:09:00 np0005540826 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Dec  1 04:09:00 np0005540826 kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Dec  1 04:09:00 np0005540826 kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Dec  1 04:09:00 np0005540826 kernel: kvm-guest: PV spinlocks disabled, no host support
Dec  1 04:09:00 np0005540826 kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-642.el9.x86_64 root=UUID=b277050f-8ace-464d-abb6-4c46d4c45253 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Dec  1 04:09:00 np0005540826 kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-642.el9.x86_64", will be passed to user space.
Dec  1 04:09:00 np0005540826 kernel: random: crng init done
Dec  1 04:09:00 np0005540826 kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Dec  1 04:09:00 np0005540826 kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Dec  1 04:09:00 np0005540826 kernel: Fallback order for Node 0: 0 
Dec  1 04:09:00 np0005540826 kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Dec  1 04:09:00 np0005540826 kernel: Policy zone: Normal
Dec  1 04:09:00 np0005540826 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Dec  1 04:09:00 np0005540826 kernel: software IO TLB: area num 8.
Dec  1 04:09:00 np0005540826 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Dec  1 04:09:00 np0005540826 kernel: ftrace: allocating 49313 entries in 193 pages
Dec  1 04:09:00 np0005540826 kernel: ftrace: allocated 193 pages with 3 groups
Dec  1 04:09:00 np0005540826 kernel: Dynamic Preempt: voluntary
Dec  1 04:09:00 np0005540826 kernel: rcu: Preemptible hierarchical RCU implementation.
Dec  1 04:09:00 np0005540826 kernel: rcu: 	RCU event tracing is enabled.
Dec  1 04:09:00 np0005540826 kernel: rcu: 	RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Dec  1 04:09:00 np0005540826 kernel: 	Trampoline variant of Tasks RCU enabled.
Dec  1 04:09:00 np0005540826 kernel: 	Rude variant of Tasks RCU enabled.
Dec  1 04:09:00 np0005540826 kernel: 	Tracing variant of Tasks RCU enabled.
Dec  1 04:09:00 np0005540826 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Dec  1 04:09:00 np0005540826 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Dec  1 04:09:00 np0005540826 kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Dec  1 04:09:00 np0005540826 kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Dec  1 04:09:00 np0005540826 kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Dec  1 04:09:00 np0005540826 kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Dec  1 04:09:00 np0005540826 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Dec  1 04:09:00 np0005540826 kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Dec  1 04:09:00 np0005540826 kernel: Console: colour VGA+ 80x25
Dec  1 04:09:00 np0005540826 kernel: printk: console [ttyS0] enabled
Dec  1 04:09:00 np0005540826 kernel: ACPI: Core revision 20230331
Dec  1 04:09:00 np0005540826 kernel: APIC: Switch to symmetric I/O mode setup
Dec  1 04:09:00 np0005540826 kernel: x2apic enabled
Dec  1 04:09:00 np0005540826 kernel: APIC: Switched APIC routing to: physical x2apic
Dec  1 04:09:00 np0005540826 kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Dec  1 04:09:00 np0005540826 kernel: Calibrating delay loop (skipped) preset value.. 5600.00 BogoMIPS (lpj=2800000)
Dec  1 04:09:00 np0005540826 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Dec  1 04:09:00 np0005540826 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Dec  1 04:09:00 np0005540826 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Dec  1 04:09:00 np0005540826 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Dec  1 04:09:00 np0005540826 kernel: Spectre V2 : Mitigation: Retpolines
Dec  1 04:09:00 np0005540826 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Dec  1 04:09:00 np0005540826 kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Dec  1 04:09:00 np0005540826 kernel: RETBleed: Mitigation: untrained return thunk
Dec  1 04:09:00 np0005540826 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Dec  1 04:09:00 np0005540826 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Dec  1 04:09:00 np0005540826 kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Dec  1 04:09:00 np0005540826 kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Dec  1 04:09:00 np0005540826 kernel: x86/bugs: return thunk changed
Dec  1 04:09:00 np0005540826 kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Dec  1 04:09:00 np0005540826 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Dec  1 04:09:00 np0005540826 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Dec  1 04:09:00 np0005540826 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Dec  1 04:09:00 np0005540826 kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Dec  1 04:09:00 np0005540826 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Dec  1 04:09:00 np0005540826 kernel: Freeing SMP alternatives memory: 40K
Dec  1 04:09:00 np0005540826 kernel: pid_max: default: 32768 minimum: 301
Dec  1 04:09:00 np0005540826 kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Dec  1 04:09:00 np0005540826 kernel: landlock: Up and running.
Dec  1 04:09:00 np0005540826 kernel: Yama: becoming mindful.
Dec  1 04:09:00 np0005540826 kernel: SELinux:  Initializing.
Dec  1 04:09:00 np0005540826 kernel: LSM support for eBPF active
Dec  1 04:09:00 np0005540826 kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Dec  1 04:09:00 np0005540826 kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Dec  1 04:09:00 np0005540826 kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Dec  1 04:09:00 np0005540826 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Dec  1 04:09:00 np0005540826 kernel: ... version:                0
Dec  1 04:09:00 np0005540826 kernel: ... bit width:              48
Dec  1 04:09:00 np0005540826 kernel: ... generic registers:      6
Dec  1 04:09:00 np0005540826 kernel: ... value mask:             0000ffffffffffff
Dec  1 04:09:00 np0005540826 kernel: ... max period:             00007fffffffffff
Dec  1 04:09:00 np0005540826 kernel: ... fixed-purpose events:   0
Dec  1 04:09:00 np0005540826 kernel: ... event mask:             000000000000003f
Dec  1 04:09:00 np0005540826 kernel: signal: max sigframe size: 1776
Dec  1 04:09:00 np0005540826 kernel: rcu: Hierarchical SRCU implementation.
Dec  1 04:09:00 np0005540826 kernel: rcu: 	Max phase no-delay instances is 400.
Dec  1 04:09:00 np0005540826 kernel: smp: Bringing up secondary CPUs ...
Dec  1 04:09:00 np0005540826 kernel: smpboot: x86: Booting SMP configuration:
Dec  1 04:09:00 np0005540826 kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Dec  1 04:09:00 np0005540826 kernel: smp: Brought up 1 node, 8 CPUs
Dec  1 04:09:00 np0005540826 kernel: smpboot: Total of 8 processors activated (44800.00 BogoMIPS)
Dec  1 04:09:00 np0005540826 kernel: node 0 deferred pages initialised in 50ms
Dec  1 04:09:00 np0005540826 kernel: Memory: 7765812K/8388068K available (16384K kernel code, 5787K rwdata, 13900K rodata, 4192K init, 7172K bss, 616268K reserved, 0K cma-reserved)
Dec  1 04:09:00 np0005540826 kernel: devtmpfs: initialized
Dec  1 04:09:00 np0005540826 kernel: x86/mm: Memory block size: 128MB
Dec  1 04:09:00 np0005540826 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Dec  1 04:09:00 np0005540826 kernel: futex hash table entries: 2048 (order: 5, 131072 bytes, linear)
Dec  1 04:09:00 np0005540826 kernel: pinctrl core: initialized pinctrl subsystem
Dec  1 04:09:00 np0005540826 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Dec  1 04:09:00 np0005540826 kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Dec  1 04:09:00 np0005540826 kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Dec  1 04:09:00 np0005540826 kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Dec  1 04:09:00 np0005540826 kernel: audit: initializing netlink subsys (disabled)
Dec  1 04:09:00 np0005540826 kernel: audit: type=2000 audit(1764580138.119:1): state=initialized audit_enabled=0 res=1
Dec  1 04:09:00 np0005540826 kernel: thermal_sys: Registered thermal governor 'fair_share'
Dec  1 04:09:00 np0005540826 kernel: thermal_sys: Registered thermal governor 'step_wise'
Dec  1 04:09:00 np0005540826 kernel: thermal_sys: Registered thermal governor 'user_space'
Dec  1 04:09:00 np0005540826 kernel: cpuidle: using governor menu
Dec  1 04:09:00 np0005540826 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Dec  1 04:09:00 np0005540826 kernel: PCI: Using configuration type 1 for base access
Dec  1 04:09:00 np0005540826 kernel: PCI: Using configuration type 1 for extended access
Dec  1 04:09:00 np0005540826 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Dec  1 04:09:00 np0005540826 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Dec  1 04:09:00 np0005540826 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Dec  1 04:09:00 np0005540826 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Dec  1 04:09:00 np0005540826 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Dec  1 04:09:00 np0005540826 kernel: Demotion targets for Node 0: null
Dec  1 04:09:00 np0005540826 kernel: cryptd: max_cpu_qlen set to 1000
Dec  1 04:09:00 np0005540826 kernel: ACPI: Added _OSI(Module Device)
Dec  1 04:09:00 np0005540826 kernel: ACPI: Added _OSI(Processor Device)
Dec  1 04:09:00 np0005540826 kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Dec  1 04:09:00 np0005540826 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Dec  1 04:09:00 np0005540826 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Dec  1 04:09:00 np0005540826 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Dec  1 04:09:00 np0005540826 kernel: ACPI: Interpreter enabled
Dec  1 04:09:00 np0005540826 kernel: ACPI: PM: (supports S0 S3 S4 S5)
Dec  1 04:09:00 np0005540826 kernel: ACPI: Using IOAPIC for interrupt routing
Dec  1 04:09:00 np0005540826 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Dec  1 04:09:00 np0005540826 kernel: PCI: Using E820 reservations for host bridge windows
Dec  1 04:09:00 np0005540826 kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Dec  1 04:09:00 np0005540826 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Dec  1 04:09:00 np0005540826 kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Dec  1 04:09:00 np0005540826 kernel: acpiphp: Slot [3] registered
Dec  1 04:09:00 np0005540826 kernel: acpiphp: Slot [4] registered
Dec  1 04:09:00 np0005540826 kernel: acpiphp: Slot [5] registered
Dec  1 04:09:00 np0005540826 kernel: acpiphp: Slot [6] registered
Dec  1 04:09:00 np0005540826 kernel: acpiphp: Slot [7] registered
Dec  1 04:09:00 np0005540826 kernel: acpiphp: Slot [8] registered
Dec  1 04:09:00 np0005540826 kernel: acpiphp: Slot [9] registered
Dec  1 04:09:00 np0005540826 kernel: acpiphp: Slot [10] registered
Dec  1 04:09:00 np0005540826 kernel: acpiphp: Slot [11] registered
Dec  1 04:09:00 np0005540826 kernel: acpiphp: Slot [12] registered
Dec  1 04:09:00 np0005540826 kernel: acpiphp: Slot [13] registered
Dec  1 04:09:00 np0005540826 kernel: acpiphp: Slot [14] registered
Dec  1 04:09:00 np0005540826 kernel: acpiphp: Slot [15] registered
Dec  1 04:09:00 np0005540826 kernel: acpiphp: Slot [16] registered
Dec  1 04:09:00 np0005540826 kernel: acpiphp: Slot [17] registered
Dec  1 04:09:00 np0005540826 kernel: acpiphp: Slot [18] registered
Dec  1 04:09:00 np0005540826 kernel: acpiphp: Slot [19] registered
Dec  1 04:09:00 np0005540826 kernel: acpiphp: Slot [20] registered
Dec  1 04:09:00 np0005540826 kernel: acpiphp: Slot [21] registered
Dec  1 04:09:00 np0005540826 kernel: acpiphp: Slot [22] registered
Dec  1 04:09:00 np0005540826 kernel: acpiphp: Slot [23] registered
Dec  1 04:09:00 np0005540826 kernel: acpiphp: Slot [24] registered
Dec  1 04:09:00 np0005540826 kernel: acpiphp: Slot [25] registered
Dec  1 04:09:00 np0005540826 kernel: acpiphp: Slot [26] registered
Dec  1 04:09:00 np0005540826 kernel: acpiphp: Slot [27] registered
Dec  1 04:09:00 np0005540826 kernel: acpiphp: Slot [28] registered
Dec  1 04:09:00 np0005540826 kernel: acpiphp: Slot [29] registered
Dec  1 04:09:00 np0005540826 kernel: acpiphp: Slot [30] registered
Dec  1 04:09:00 np0005540826 kernel: acpiphp: Slot [31] registered
Dec  1 04:09:00 np0005540826 kernel: PCI host bridge to bus 0000:00
Dec  1 04:09:00 np0005540826 kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Dec  1 04:09:00 np0005540826 kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Dec  1 04:09:00 np0005540826 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Dec  1 04:09:00 np0005540826 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Dec  1 04:09:00 np0005540826 kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Dec  1 04:09:00 np0005540826 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Dec  1 04:09:00 np0005540826 kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Dec  1 04:09:00 np0005540826 kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Dec  1 04:09:00 np0005540826 kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Dec  1 04:09:00 np0005540826 kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Dec  1 04:09:00 np0005540826 kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Dec  1 04:09:00 np0005540826 kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Dec  1 04:09:00 np0005540826 kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Dec  1 04:09:00 np0005540826 kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Dec  1 04:09:00 np0005540826 kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Dec  1 04:09:00 np0005540826 kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Dec  1 04:09:00 np0005540826 kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Dec  1 04:09:00 np0005540826 kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Dec  1 04:09:00 np0005540826 kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Dec  1 04:09:00 np0005540826 kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Dec  1 04:09:00 np0005540826 kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Dec  1 04:09:00 np0005540826 kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Dec  1 04:09:00 np0005540826 kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Dec  1 04:09:00 np0005540826 kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Dec  1 04:09:00 np0005540826 kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Dec  1 04:09:00 np0005540826 kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Dec  1 04:09:00 np0005540826 kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Dec  1 04:09:00 np0005540826 kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Dec  1 04:09:00 np0005540826 kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Dec  1 04:09:00 np0005540826 kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Dec  1 04:09:00 np0005540826 kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Dec  1 04:09:00 np0005540826 kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Dec  1 04:09:00 np0005540826 kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Dec  1 04:09:00 np0005540826 kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Dec  1 04:09:00 np0005540826 kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Dec  1 04:09:00 np0005540826 kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Dec  1 04:09:00 np0005540826 kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Dec  1 04:09:00 np0005540826 kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Dec  1 04:09:00 np0005540826 kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Dec  1 04:09:00 np0005540826 kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Dec  1 04:09:00 np0005540826 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Dec  1 04:09:00 np0005540826 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Dec  1 04:09:00 np0005540826 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Dec  1 04:09:00 np0005540826 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Dec  1 04:09:00 np0005540826 kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Dec  1 04:09:00 np0005540826 kernel: iommu: Default domain type: Translated
Dec  1 04:09:00 np0005540826 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Dec  1 04:09:00 np0005540826 kernel: SCSI subsystem initialized
Dec  1 04:09:00 np0005540826 kernel: ACPI: bus type USB registered
Dec  1 04:09:00 np0005540826 kernel: usbcore: registered new interface driver usbfs
Dec  1 04:09:00 np0005540826 kernel: usbcore: registered new interface driver hub
Dec  1 04:09:00 np0005540826 kernel: usbcore: registered new device driver usb
Dec  1 04:09:00 np0005540826 kernel: pps_core: LinuxPPS API ver. 1 registered
Dec  1 04:09:00 np0005540826 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Dec  1 04:09:00 np0005540826 kernel: PTP clock support registered
Dec  1 04:09:00 np0005540826 kernel: EDAC MC: Ver: 3.0.0
Dec  1 04:09:00 np0005540826 kernel: NetLabel: Initializing
Dec  1 04:09:00 np0005540826 kernel: NetLabel:  domain hash size = 128
Dec  1 04:09:00 np0005540826 kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Dec  1 04:09:00 np0005540826 kernel: NetLabel:  unlabeled traffic allowed by default
Dec  1 04:09:00 np0005540826 kernel: PCI: Using ACPI for IRQ routing
Dec  1 04:09:00 np0005540826 kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Dec  1 04:09:00 np0005540826 kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Dec  1 04:09:00 np0005540826 kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Dec  1 04:09:00 np0005540826 kernel: vgaarb: loaded
Dec  1 04:09:00 np0005540826 kernel: clocksource: Switched to clocksource kvm-clock
Dec  1 04:09:00 np0005540826 kernel: VFS: Disk quotas dquot_6.6.0
Dec  1 04:09:00 np0005540826 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Dec  1 04:09:00 np0005540826 kernel: pnp: PnP ACPI init
Dec  1 04:09:00 np0005540826 kernel: pnp: PnP ACPI: found 5 devices
Dec  1 04:09:00 np0005540826 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Dec  1 04:09:00 np0005540826 kernel: NET: Registered PF_INET protocol family
Dec  1 04:09:00 np0005540826 kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Dec  1 04:09:00 np0005540826 kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Dec  1 04:09:00 np0005540826 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Dec  1 04:09:00 np0005540826 kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Dec  1 04:09:00 np0005540826 kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Dec  1 04:09:00 np0005540826 kernel: TCP: Hash tables configured (established 65536 bind 65536)
Dec  1 04:09:00 np0005540826 kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Dec  1 04:09:00 np0005540826 kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Dec  1 04:09:00 np0005540826 kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Dec  1 04:09:00 np0005540826 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Dec  1 04:09:00 np0005540826 kernel: NET: Registered PF_XDP protocol family
Dec  1 04:09:00 np0005540826 kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Dec  1 04:09:00 np0005540826 kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Dec  1 04:09:00 np0005540826 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Dec  1 04:09:00 np0005540826 kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Dec  1 04:09:00 np0005540826 kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Dec  1 04:09:00 np0005540826 kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Dec  1 04:09:00 np0005540826 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Dec  1 04:09:00 np0005540826 kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Dec  1 04:09:00 np0005540826 kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x160 took 73061 usecs
Dec  1 04:09:00 np0005540826 kernel: PCI: CLS 0 bytes, default 64
Dec  1 04:09:00 np0005540826 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Dec  1 04:09:00 np0005540826 kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Dec  1 04:09:00 np0005540826 kernel: ACPI: bus type thunderbolt registered
Dec  1 04:09:00 np0005540826 kernel: Trying to unpack rootfs image as initramfs...
Dec  1 04:09:00 np0005540826 kernel: Initialise system trusted keyrings
Dec  1 04:09:00 np0005540826 kernel: Key type blacklist registered
Dec  1 04:09:00 np0005540826 kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Dec  1 04:09:00 np0005540826 kernel: zbud: loaded
Dec  1 04:09:00 np0005540826 kernel: integrity: Platform Keyring initialized
Dec  1 04:09:00 np0005540826 kernel: integrity: Machine keyring initialized
Dec  1 04:09:00 np0005540826 kernel: Freeing initrd memory: 85868K
Dec  1 04:09:00 np0005540826 kernel: NET: Registered PF_ALG protocol family
Dec  1 04:09:00 np0005540826 kernel: xor: automatically using best checksumming function   avx       
Dec  1 04:09:00 np0005540826 kernel: Key type asymmetric registered
Dec  1 04:09:00 np0005540826 kernel: Asymmetric key parser 'x509' registered
Dec  1 04:09:00 np0005540826 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Dec  1 04:09:00 np0005540826 kernel: io scheduler mq-deadline registered
Dec  1 04:09:00 np0005540826 kernel: io scheduler kyber registered
Dec  1 04:09:00 np0005540826 kernel: io scheduler bfq registered
Dec  1 04:09:00 np0005540826 kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Dec  1 04:09:00 np0005540826 kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Dec  1 04:09:00 np0005540826 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Dec  1 04:09:00 np0005540826 kernel: ACPI: button: Power Button [PWRF]
Dec  1 04:09:00 np0005540826 kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Dec  1 04:09:00 np0005540826 kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Dec  1 04:09:00 np0005540826 kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Dec  1 04:09:00 np0005540826 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Dec  1 04:09:00 np0005540826 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Dec  1 04:09:00 np0005540826 kernel: Non-volatile memory driver v1.3
Dec  1 04:09:00 np0005540826 kernel: rdac: device handler registered
Dec  1 04:09:00 np0005540826 kernel: hp_sw: device handler registered
Dec  1 04:09:00 np0005540826 kernel: emc: device handler registered
Dec  1 04:09:00 np0005540826 kernel: alua: device handler registered
Dec  1 04:09:00 np0005540826 kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Dec  1 04:09:00 np0005540826 kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Dec  1 04:09:00 np0005540826 kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Dec  1 04:09:00 np0005540826 kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Dec  1 04:09:00 np0005540826 kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Dec  1 04:09:00 np0005540826 kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Dec  1 04:09:00 np0005540826 kernel: usb usb1: Product: UHCI Host Controller
Dec  1 04:09:00 np0005540826 kernel: usb usb1: Manufacturer: Linux 5.14.0-642.el9.x86_64 uhci_hcd
Dec  1 04:09:00 np0005540826 kernel: usb usb1: SerialNumber: 0000:00:01.2
Dec  1 04:09:00 np0005540826 kernel: hub 1-0:1.0: USB hub found
Dec  1 04:09:00 np0005540826 kernel: hub 1-0:1.0: 2 ports detected
Dec  1 04:09:00 np0005540826 kernel: usbcore: registered new interface driver usbserial_generic
Dec  1 04:09:00 np0005540826 kernel: usbserial: USB Serial support registered for generic
Dec  1 04:09:00 np0005540826 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Dec  1 04:09:00 np0005540826 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Dec  1 04:09:00 np0005540826 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Dec  1 04:09:00 np0005540826 kernel: mousedev: PS/2 mouse device common for all mice
Dec  1 04:09:00 np0005540826 kernel: rtc_cmos 00:04: RTC can wake from S4
Dec  1 04:09:00 np0005540826 kernel: rtc_cmos 00:04: registered as rtc0
Dec  1 04:09:00 np0005540826 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Dec  1 04:09:00 np0005540826 kernel: rtc_cmos 00:04: setting system clock to 2025-12-01T09:08:59 UTC (1764580139)
Dec  1 04:09:00 np0005540826 kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Dec  1 04:09:00 np0005540826 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Dec  1 04:09:00 np0005540826 kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Dec  1 04:09:00 np0005540826 kernel: hid: raw HID events driver (C) Jiri Kosina
Dec  1 04:09:00 np0005540826 kernel: usbcore: registered new interface driver usbhid
Dec  1 04:09:00 np0005540826 kernel: usbhid: USB HID core driver
Dec  1 04:09:00 np0005540826 kernel: drop_monitor: Initializing network drop monitor service
Dec  1 04:09:00 np0005540826 kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Dec  1 04:09:00 np0005540826 kernel: Initializing XFRM netlink socket
Dec  1 04:09:00 np0005540826 kernel: NET: Registered PF_INET6 protocol family
Dec  1 04:09:00 np0005540826 kernel: Segment Routing with IPv6
Dec  1 04:09:00 np0005540826 kernel: NET: Registered PF_PACKET protocol family
Dec  1 04:09:00 np0005540826 kernel: mpls_gso: MPLS GSO support
Dec  1 04:09:00 np0005540826 kernel: IPI shorthand broadcast: enabled
Dec  1 04:09:00 np0005540826 kernel: AVX2 version of gcm_enc/dec engaged.
Dec  1 04:09:00 np0005540826 kernel: AES CTR mode by8 optimization enabled
Dec  1 04:09:00 np0005540826 kernel: sched_clock: Marking stable (2508002850, 204326510)->(3004747320, -292417960)
Dec  1 04:09:00 np0005540826 kernel: registered taskstats version 1
Dec  1 04:09:00 np0005540826 kernel: Loading compiled-in X.509 certificates
Dec  1 04:09:00 np0005540826 kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 8ec4bd273f582f9a9b9a494ae677ca1f1488f19e'
Dec  1 04:09:00 np0005540826 kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Dec  1 04:09:00 np0005540826 kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Dec  1 04:09:00 np0005540826 kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Dec  1 04:09:00 np0005540826 kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Dec  1 04:09:00 np0005540826 kernel: Demotion targets for Node 0: null
Dec  1 04:09:00 np0005540826 kernel: page_owner is disabled
Dec  1 04:09:00 np0005540826 kernel: Key type .fscrypt registered
Dec  1 04:09:00 np0005540826 kernel: Key type fscrypt-provisioning registered
Dec  1 04:09:00 np0005540826 kernel: Key type big_key registered
Dec  1 04:09:00 np0005540826 kernel: Key type encrypted registered
Dec  1 04:09:00 np0005540826 kernel: ima: No TPM chip found, activating TPM-bypass!
Dec  1 04:09:00 np0005540826 kernel: Loading compiled-in module X.509 certificates
Dec  1 04:09:00 np0005540826 kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 8ec4bd273f582f9a9b9a494ae677ca1f1488f19e'
Dec  1 04:09:00 np0005540826 kernel: ima: Allocated hash algorithm: sha256
Dec  1 04:09:00 np0005540826 kernel: ima: No architecture policies found
Dec  1 04:09:00 np0005540826 kernel: evm: Initialising EVM extended attributes:
Dec  1 04:09:00 np0005540826 kernel: evm: security.selinux
Dec  1 04:09:00 np0005540826 kernel: evm: security.SMACK64 (disabled)
Dec  1 04:09:00 np0005540826 kernel: evm: security.SMACK64EXEC (disabled)
Dec  1 04:09:00 np0005540826 kernel: evm: security.SMACK64TRANSMUTE (disabled)
Dec  1 04:09:00 np0005540826 kernel: evm: security.SMACK64MMAP (disabled)
Dec  1 04:09:00 np0005540826 kernel: evm: security.apparmor (disabled)
Dec  1 04:09:00 np0005540826 kernel: evm: security.ima
Dec  1 04:09:00 np0005540826 kernel: evm: security.capability
Dec  1 04:09:00 np0005540826 kernel: evm: HMAC attrs: 0x1
Dec  1 04:09:00 np0005540826 kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Dec  1 04:09:00 np0005540826 kernel: Running certificate verification RSA selftest
Dec  1 04:09:00 np0005540826 kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Dec  1 04:09:00 np0005540826 kernel: Running certificate verification ECDSA selftest
Dec  1 04:09:00 np0005540826 kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Dec  1 04:09:00 np0005540826 kernel: clk: Disabling unused clocks
Dec  1 04:09:00 np0005540826 kernel: Freeing unused decrypted memory: 2028K
Dec  1 04:09:00 np0005540826 kernel: Freeing unused kernel image (initmem) memory: 4192K
Dec  1 04:09:00 np0005540826 kernel: Write protecting the kernel read-only data: 30720k
Dec  1 04:09:00 np0005540826 kernel: Freeing unused kernel image (rodata/data gap) memory: 436K
Dec  1 04:09:00 np0005540826 kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Dec  1 04:09:00 np0005540826 kernel: Run /init as init process
Dec  1 04:09:00 np0005540826 systemd: systemd 252-59.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Dec  1 04:09:00 np0005540826 systemd: Detected virtualization kvm.
Dec  1 04:09:00 np0005540826 systemd: Detected architecture x86-64.
Dec  1 04:09:00 np0005540826 systemd: Running in initrd.
Dec  1 04:09:00 np0005540826 systemd: No hostname configured, using default hostname.
Dec  1 04:09:00 np0005540826 systemd: Hostname set to <localhost>.
Dec  1 04:09:00 np0005540826 systemd: Initializing machine ID from VM UUID.
Dec  1 04:09:00 np0005540826 kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Dec  1 04:09:00 np0005540826 kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Dec  1 04:09:00 np0005540826 kernel: usb 1-1: Product: QEMU USB Tablet
Dec  1 04:09:00 np0005540826 kernel: usb 1-1: Manufacturer: QEMU
Dec  1 04:09:00 np0005540826 kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Dec  1 04:09:00 np0005540826 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Dec  1 04:09:00 np0005540826 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Dec  1 04:09:00 np0005540826 systemd: Queued start job for default target Initrd Default Target.
Dec  1 04:09:00 np0005540826 systemd: Started Dispatch Password Requests to Console Directory Watch.
Dec  1 04:09:00 np0005540826 systemd: Reached target Local Encrypted Volumes.
Dec  1 04:09:00 np0005540826 systemd: Reached target Initrd /usr File System.
Dec  1 04:09:00 np0005540826 systemd: Reached target Local File Systems.
Dec  1 04:09:00 np0005540826 systemd: Reached target Path Units.
Dec  1 04:09:00 np0005540826 systemd: Reached target Slice Units.
Dec  1 04:09:00 np0005540826 systemd: Reached target Swaps.
Dec  1 04:09:00 np0005540826 systemd: Reached target Timer Units.
Dec  1 04:09:00 np0005540826 systemd: Listening on D-Bus System Message Bus Socket.
Dec  1 04:09:00 np0005540826 systemd: Listening on Journal Socket (/dev/log).
Dec  1 04:09:00 np0005540826 systemd: Listening on Journal Socket.
Dec  1 04:09:00 np0005540826 systemd: Listening on udev Control Socket.
Dec  1 04:09:00 np0005540826 systemd: Listening on udev Kernel Socket.
Dec  1 04:09:00 np0005540826 systemd: Reached target Socket Units.
Dec  1 04:09:00 np0005540826 systemd: Starting Create List of Static Device Nodes...
Dec  1 04:09:00 np0005540826 systemd: Starting Journal Service...
Dec  1 04:09:00 np0005540826 systemd: Load Kernel Modules was skipped because no trigger condition checks were met.
Dec  1 04:09:00 np0005540826 systemd: Starting Apply Kernel Variables...
Dec  1 04:09:00 np0005540826 systemd: Starting Create System Users...
Dec  1 04:09:00 np0005540826 systemd: Starting Setup Virtual Console...
Dec  1 04:09:00 np0005540826 systemd: Finished Create List of Static Device Nodes.
Dec  1 04:09:00 np0005540826 systemd: Finished Apply Kernel Variables.
Dec  1 04:09:00 np0005540826 systemd: Finished Create System Users.
Dec  1 04:09:00 np0005540826 systemd: Starting Create Static Device Nodes in /dev...
Dec  1 04:09:00 np0005540826 systemd-journald[309]: Journal started
Dec  1 04:09:00 np0005540826 systemd-journald[309]: Runtime Journal (/run/log/journal/bbc1bf3e9c614776b76663a97b665391) is 8.0M, max 153.6M, 145.6M free.
Dec  1 04:09:00 np0005540826 systemd-sysusers[313]: Creating group 'users' with GID 100.
Dec  1 04:09:00 np0005540826 systemd-sysusers[313]: Creating group 'dbus' with GID 81.
Dec  1 04:09:00 np0005540826 systemd-sysusers[313]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Dec  1 04:09:00 np0005540826 systemd: Started Journal Service.
Dec  1 04:09:00 np0005540826 systemd[1]: Starting Create Volatile Files and Directories...
Dec  1 04:09:00 np0005540826 systemd[1]: Finished Create Static Device Nodes in /dev.
Dec  1 04:09:00 np0005540826 systemd[1]: Finished Create Volatile Files and Directories.
Dec  1 04:09:00 np0005540826 systemd[1]: Finished Setup Virtual Console.
Dec  1 04:09:00 np0005540826 systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Dec  1 04:09:00 np0005540826 systemd[1]: Starting dracut cmdline hook...
Dec  1 04:09:00 np0005540826 dracut-cmdline[330]: dracut-9 dracut-057-102.git20250818.el9
Dec  1 04:09:00 np0005540826 dracut-cmdline[330]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-642.el9.x86_64 root=UUID=b277050f-8ace-464d-abb6-4c46d4c45253 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Dec  1 04:09:00 np0005540826 systemd[1]: Finished dracut cmdline hook.
Dec  1 04:09:00 np0005540826 systemd[1]: Starting dracut pre-udev hook...
Dec  1 04:09:00 np0005540826 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Dec  1 04:09:00 np0005540826 kernel: device-mapper: uevent: version 1.0.3
Dec  1 04:09:00 np0005540826 kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Dec  1 04:09:00 np0005540826 kernel: RPC: Registered named UNIX socket transport module.
Dec  1 04:09:00 np0005540826 kernel: RPC: Registered udp transport module.
Dec  1 04:09:00 np0005540826 kernel: RPC: Registered tcp transport module.
Dec  1 04:09:00 np0005540826 kernel: RPC: Registered tcp-with-tls transport module.
Dec  1 04:09:00 np0005540826 kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Dec  1 04:09:00 np0005540826 rpc.statd[448]: Version 2.5.4 starting
Dec  1 04:09:00 np0005540826 rpc.statd[448]: Initializing NSM state
Dec  1 04:09:00 np0005540826 rpc.idmapd[453]: Setting log level to 0
Dec  1 04:09:00 np0005540826 systemd[1]: Finished dracut pre-udev hook.
Dec  1 04:09:00 np0005540826 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Dec  1 04:09:00 np0005540826 systemd-udevd[466]: Using default interface naming scheme 'rhel-9.0'.
Dec  1 04:09:00 np0005540826 systemd[1]: Started Rule-based Manager for Device Events and Files.
Dec  1 04:09:00 np0005540826 systemd[1]: Starting dracut pre-trigger hook...
Dec  1 04:09:00 np0005540826 systemd[1]: Finished dracut pre-trigger hook.
Dec  1 04:09:00 np0005540826 systemd[1]: Starting Coldplug All udev Devices...
Dec  1 04:09:01 np0005540826 systemd[1]: Created slice Slice /system/modprobe.
Dec  1 04:09:01 np0005540826 systemd[1]: Starting Load Kernel Module configfs...
Dec  1 04:09:01 np0005540826 systemd[1]: Finished Coldplug All udev Devices.
Dec  1 04:09:01 np0005540826 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Dec  1 04:09:01 np0005540826 systemd[1]: Finished Load Kernel Module configfs.
Dec  1 04:09:01 np0005540826 systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Dec  1 04:09:01 np0005540826 systemd[1]: Reached target Network.
Dec  1 04:09:01 np0005540826 systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Dec  1 04:09:01 np0005540826 systemd[1]: Starting dracut initqueue hook...
Dec  1 04:09:01 np0005540826 kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Dec  1 04:09:01 np0005540826 kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Dec  1 04:09:01 np0005540826 kernel: scsi host0: ata_piix
Dec  1 04:09:01 np0005540826 kernel: scsi host1: ata_piix
Dec  1 04:09:01 np0005540826 kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Dec  1 04:09:01 np0005540826 kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Dec  1 04:09:01 np0005540826 kernel: vda: vda1
Dec  1 04:09:01 np0005540826 systemd[1]: Mounting Kernel Configuration File System...
Dec  1 04:09:01 np0005540826 systemd[1]: Mounted Kernel Configuration File System.
Dec  1 04:09:01 np0005540826 systemd[1]: Reached target System Initialization.
Dec  1 04:09:01 np0005540826 systemd[1]: Reached target Basic System.
Dec  1 04:09:01 np0005540826 kernel: ata1: found unknown device (class 0)
Dec  1 04:09:01 np0005540826 kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Dec  1 04:09:01 np0005540826 kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Dec  1 04:09:01 np0005540826 systemd-udevd[500]: Network interface NamePolicy= disabled on kernel command line.
Dec  1 04:09:01 np0005540826 kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Dec  1 04:09:01 np0005540826 systemd[1]: Found device /dev/disk/by-uuid/b277050f-8ace-464d-abb6-4c46d4c45253.
Dec  1 04:09:01 np0005540826 kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Dec  1 04:09:01 np0005540826 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Dec  1 04:09:01 np0005540826 systemd[1]: Reached target Initrd Root Device.
Dec  1 04:09:01 np0005540826 systemd[1]: Finished dracut initqueue hook.
Dec  1 04:09:01 np0005540826 systemd[1]: Reached target Preparation for Remote File Systems.
Dec  1 04:09:01 np0005540826 systemd[1]: Reached target Remote Encrypted Volumes.
Dec  1 04:09:01 np0005540826 systemd[1]: Reached target Remote File Systems.
Dec  1 04:09:01 np0005540826 systemd[1]: Starting dracut pre-mount hook...
Dec  1 04:09:01 np0005540826 systemd[1]: Finished dracut pre-mount hook.
Dec  1 04:09:01 np0005540826 systemd[1]: Starting File System Check on /dev/disk/by-uuid/b277050f-8ace-464d-abb6-4c46d4c45253...
Dec  1 04:09:01 np0005540826 systemd-fsck[558]: /usr/sbin/fsck.xfs: XFS file system.
Dec  1 04:09:01 np0005540826 systemd[1]: Finished File System Check on /dev/disk/by-uuid/b277050f-8ace-464d-abb6-4c46d4c45253.
Dec  1 04:09:01 np0005540826 systemd[1]: Mounting /sysroot...
Dec  1 04:09:01 np0005540826 kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Dec  1 04:09:01 np0005540826 kernel: XFS (vda1): Mounting V5 Filesystem b277050f-8ace-464d-abb6-4c46d4c45253
Dec  1 04:09:02 np0005540826 kernel: XFS (vda1): Ending clean mount
Dec  1 04:09:02 np0005540826 systemd[1]: Mounted /sysroot.
Dec  1 04:09:02 np0005540826 systemd[1]: Reached target Initrd Root File System.
Dec  1 04:09:02 np0005540826 systemd[1]: Starting Mountpoints Configured in the Real Root...
Dec  1 04:09:02 np0005540826 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Dec  1 04:09:02 np0005540826 systemd[1]: Finished Mountpoints Configured in the Real Root.
Dec  1 04:09:02 np0005540826 systemd[1]: Reached target Initrd File Systems.
Dec  1 04:09:02 np0005540826 systemd[1]: Reached target Initrd Default Target.
Dec  1 04:09:02 np0005540826 systemd[1]: Starting dracut mount hook...
Dec  1 04:09:02 np0005540826 systemd[1]: Finished dracut mount hook.
Dec  1 04:09:02 np0005540826 systemd[1]: Starting dracut pre-pivot and cleanup hook...
Dec  1 04:09:02 np0005540826 rpc.idmapd[453]: exiting on signal 15
Dec  1 04:09:02 np0005540826 systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Dec  1 04:09:02 np0005540826 systemd[1]: Finished dracut pre-pivot and cleanup hook.
Dec  1 04:09:02 np0005540826 systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Dec  1 04:09:02 np0005540826 systemd[1]: Stopped target Network.
Dec  1 04:09:02 np0005540826 systemd[1]: Stopped target Remote Encrypted Volumes.
Dec  1 04:09:02 np0005540826 systemd[1]: Stopped target Timer Units.
Dec  1 04:09:02 np0005540826 systemd[1]: dbus.socket: Deactivated successfully.
Dec  1 04:09:02 np0005540826 systemd[1]: Closed D-Bus System Message Bus Socket.
Dec  1 04:09:02 np0005540826 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Dec  1 04:09:02 np0005540826 systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Dec  1 04:09:02 np0005540826 systemd[1]: Stopped target Initrd Default Target.
Dec  1 04:09:02 np0005540826 systemd[1]: Stopped target Basic System.
Dec  1 04:09:02 np0005540826 systemd[1]: Stopped target Initrd Root Device.
Dec  1 04:09:02 np0005540826 systemd[1]: Stopped target Initrd /usr File System.
Dec  1 04:09:02 np0005540826 systemd[1]: Stopped target Path Units.
Dec  1 04:09:02 np0005540826 systemd[1]: Stopped target Remote File Systems.
Dec  1 04:09:02 np0005540826 systemd[1]: Stopped target Preparation for Remote File Systems.
Dec  1 04:09:02 np0005540826 systemd[1]: Stopped target Slice Units.
Dec  1 04:09:02 np0005540826 systemd[1]: Stopped target Socket Units.
Dec  1 04:09:02 np0005540826 systemd[1]: Stopped target System Initialization.
Dec  1 04:09:02 np0005540826 systemd[1]: Stopped target Local File Systems.
Dec  1 04:09:02 np0005540826 systemd[1]: Stopped target Swaps.
Dec  1 04:09:02 np0005540826 systemd[1]: dracut-mount.service: Deactivated successfully.
Dec  1 04:09:02 np0005540826 systemd[1]: Stopped dracut mount hook.
Dec  1 04:09:02 np0005540826 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Dec  1 04:09:02 np0005540826 systemd[1]: Stopped dracut pre-mount hook.
Dec  1 04:09:02 np0005540826 systemd[1]: Stopped target Local Encrypted Volumes.
Dec  1 04:09:02 np0005540826 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Dec  1 04:09:02 np0005540826 systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Dec  1 04:09:02 np0005540826 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Dec  1 04:09:02 np0005540826 systemd[1]: Stopped dracut initqueue hook.
Dec  1 04:09:02 np0005540826 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Dec  1 04:09:02 np0005540826 systemd[1]: Stopped Apply Kernel Variables.
Dec  1 04:09:02 np0005540826 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Dec  1 04:09:02 np0005540826 systemd[1]: Stopped Create Volatile Files and Directories.
Dec  1 04:09:02 np0005540826 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Dec  1 04:09:02 np0005540826 systemd[1]: Stopped Coldplug All udev Devices.
Dec  1 04:09:02 np0005540826 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Dec  1 04:09:02 np0005540826 systemd[1]: Stopped dracut pre-trigger hook.
Dec  1 04:09:02 np0005540826 systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Dec  1 04:09:02 np0005540826 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Dec  1 04:09:02 np0005540826 systemd[1]: Stopped Setup Virtual Console.
Dec  1 04:09:02 np0005540826 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Dec  1 04:09:02 np0005540826 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Dec  1 04:09:02 np0005540826 systemd[1]: systemd-udevd.service: Deactivated successfully.
Dec  1 04:09:02 np0005540826 systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Dec  1 04:09:02 np0005540826 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Dec  1 04:09:02 np0005540826 systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Dec  1 04:09:02 np0005540826 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Dec  1 04:09:02 np0005540826 systemd[1]: Closed udev Control Socket.
Dec  1 04:09:02 np0005540826 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Dec  1 04:09:02 np0005540826 systemd[1]: Closed udev Kernel Socket.
Dec  1 04:09:02 np0005540826 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Dec  1 04:09:02 np0005540826 systemd[1]: Stopped dracut pre-udev hook.
Dec  1 04:09:02 np0005540826 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Dec  1 04:09:02 np0005540826 systemd[1]: Stopped dracut cmdline hook.
Dec  1 04:09:02 np0005540826 systemd[1]: Starting Cleanup udev Database...
Dec  1 04:09:02 np0005540826 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Dec  1 04:09:02 np0005540826 systemd[1]: Stopped Create Static Device Nodes in /dev.
Dec  1 04:09:02 np0005540826 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Dec  1 04:09:02 np0005540826 systemd[1]: Stopped Create List of Static Device Nodes.
Dec  1 04:09:02 np0005540826 systemd[1]: systemd-sysusers.service: Deactivated successfully.
Dec  1 04:09:02 np0005540826 systemd[1]: Stopped Create System Users.
Dec  1 04:09:02 np0005540826 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Dec  1 04:09:02 np0005540826 systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Dec  1 04:09:02 np0005540826 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Dec  1 04:09:02 np0005540826 systemd[1]: Finished Cleanup udev Database.
Dec  1 04:09:02 np0005540826 systemd[1]: Reached target Switch Root.
Dec  1 04:09:02 np0005540826 systemd[1]: Starting Switch Root...
Dec  1 04:09:02 np0005540826 systemd[1]: Switching root.
Dec  1 04:09:02 np0005540826 systemd-journald[309]: Journal stopped
Dec  1 04:09:03 np0005540826 systemd-journald: Received SIGTERM from PID 1 (systemd).
Dec  1 04:09:03 np0005540826 kernel: audit: type=1404 audit(1764580142.458:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Dec  1 04:09:03 np0005540826 kernel: SELinux:  policy capability network_peer_controls=1
Dec  1 04:09:03 np0005540826 kernel: SELinux:  policy capability open_perms=1
Dec  1 04:09:03 np0005540826 kernel: SELinux:  policy capability extended_socket_class=1
Dec  1 04:09:03 np0005540826 kernel: SELinux:  policy capability always_check_network=0
Dec  1 04:09:03 np0005540826 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec  1 04:09:03 np0005540826 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec  1 04:09:03 np0005540826 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec  1 04:09:03 np0005540826 kernel: audit: type=1403 audit(1764580142.581:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Dec  1 04:09:03 np0005540826 systemd: Successfully loaded SELinux policy in 125.784ms.
Dec  1 04:09:03 np0005540826 systemd: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 32.154ms.
Dec  1 04:09:03 np0005540826 systemd: systemd 252-59.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Dec  1 04:09:03 np0005540826 systemd: Detected virtualization kvm.
Dec  1 04:09:03 np0005540826 systemd: Detected architecture x86-64.
Dec  1 04:09:03 np0005540826 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:09:03 np0005540826 systemd: initrd-switch-root.service: Deactivated successfully.
Dec  1 04:09:03 np0005540826 systemd: Stopped Switch Root.
Dec  1 04:09:03 np0005540826 systemd: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Dec  1 04:09:03 np0005540826 systemd: Created slice Slice /system/getty.
Dec  1 04:09:03 np0005540826 systemd: Created slice Slice /system/serial-getty.
Dec  1 04:09:03 np0005540826 systemd: Created slice Slice /system/sshd-keygen.
Dec  1 04:09:03 np0005540826 systemd: Created slice User and Session Slice.
Dec  1 04:09:03 np0005540826 systemd: Started Dispatch Password Requests to Console Directory Watch.
Dec  1 04:09:03 np0005540826 systemd: Started Forward Password Requests to Wall Directory Watch.
Dec  1 04:09:03 np0005540826 systemd: Set up automount Arbitrary Executable File Formats File System Automount Point.
Dec  1 04:09:03 np0005540826 systemd: Reached target Local Encrypted Volumes.
Dec  1 04:09:03 np0005540826 systemd: Stopped target Switch Root.
Dec  1 04:09:03 np0005540826 systemd: Stopped target Initrd File Systems.
Dec  1 04:09:03 np0005540826 systemd: Stopped target Initrd Root File System.
Dec  1 04:09:03 np0005540826 systemd: Reached target Local Integrity Protected Volumes.
Dec  1 04:09:03 np0005540826 systemd: Reached target Path Units.
Dec  1 04:09:03 np0005540826 systemd: Reached target rpc_pipefs.target.
Dec  1 04:09:03 np0005540826 systemd: Reached target Slice Units.
Dec  1 04:09:03 np0005540826 systemd: Reached target Swaps.
Dec  1 04:09:03 np0005540826 systemd: Reached target Local Verity Protected Volumes.
Dec  1 04:09:03 np0005540826 systemd: Listening on RPCbind Server Activation Socket.
Dec  1 04:09:03 np0005540826 systemd: Reached target RPC Port Mapper.
Dec  1 04:09:03 np0005540826 systemd: Listening on Process Core Dump Socket.
Dec  1 04:09:03 np0005540826 systemd: Listening on initctl Compatibility Named Pipe.
Dec  1 04:09:03 np0005540826 systemd: Listening on udev Control Socket.
Dec  1 04:09:03 np0005540826 systemd: Listening on udev Kernel Socket.
Dec  1 04:09:03 np0005540826 systemd: Mounting Huge Pages File System...
Dec  1 04:09:03 np0005540826 systemd: Mounting POSIX Message Queue File System...
Dec  1 04:09:03 np0005540826 systemd: Mounting Kernel Debug File System...
Dec  1 04:09:03 np0005540826 systemd: Mounting Kernel Trace File System...
Dec  1 04:09:03 np0005540826 systemd: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Dec  1 04:09:03 np0005540826 systemd: Starting Create List of Static Device Nodes...
Dec  1 04:09:03 np0005540826 systemd: Starting Load Kernel Module configfs...
Dec  1 04:09:03 np0005540826 systemd: Starting Load Kernel Module drm...
Dec  1 04:09:03 np0005540826 systemd: Starting Load Kernel Module efi_pstore...
Dec  1 04:09:03 np0005540826 systemd: Starting Load Kernel Module fuse...
Dec  1 04:09:03 np0005540826 systemd: Starting Read and set NIS domainname from /etc/sysconfig/network...
Dec  1 04:09:03 np0005540826 systemd: systemd-fsck-root.service: Deactivated successfully.
Dec  1 04:09:03 np0005540826 systemd: Stopped File System Check on Root Device.
Dec  1 04:09:03 np0005540826 systemd: Stopped Journal Service.
Dec  1 04:09:03 np0005540826 systemd: Starting Journal Service...
Dec  1 04:09:03 np0005540826 systemd: Load Kernel Modules was skipped because no trigger condition checks were met.
Dec  1 04:09:03 np0005540826 systemd: Starting Generate network units from Kernel command line...
Dec  1 04:09:03 np0005540826 systemd: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Dec  1 04:09:03 np0005540826 systemd: Starting Remount Root and Kernel File Systems...
Dec  1 04:09:03 np0005540826 systemd: Repartition Root Disk was skipped because no trigger condition checks were met.
Dec  1 04:09:03 np0005540826 kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Dec  1 04:09:03 np0005540826 systemd: Starting Apply Kernel Variables...
Dec  1 04:09:03 np0005540826 systemd: Starting Coldplug All udev Devices...
Dec  1 04:09:03 np0005540826 systemd: Mounted Huge Pages File System.
Dec  1 04:09:03 np0005540826 systemd: Mounted POSIX Message Queue File System.
Dec  1 04:09:03 np0005540826 systemd: Mounted Kernel Debug File System.
Dec  1 04:09:03 np0005540826 systemd-journald[682]: Journal started
Dec  1 04:09:03 np0005540826 systemd-journald[682]: Runtime Journal (/run/log/journal/1f988c78c563e12389ab342aced42dbb) is 8.0M, max 153.6M, 145.6M free.
Dec  1 04:09:03 np0005540826 systemd[1]: Queued start job for default target Multi-User System.
Dec  1 04:09:03 np0005540826 systemd[1]: systemd-journald.service: Deactivated successfully.
Dec  1 04:09:03 np0005540826 systemd: Started Journal Service.
Dec  1 04:09:03 np0005540826 systemd[1]: Mounted Kernel Trace File System.
Dec  1 04:09:03 np0005540826 systemd[1]: Finished Create List of Static Device Nodes.
Dec  1 04:09:03 np0005540826 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Dec  1 04:09:03 np0005540826 systemd[1]: Finished Load Kernel Module configfs.
Dec  1 04:09:03 np0005540826 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Dec  1 04:09:03 np0005540826 systemd[1]: Finished Load Kernel Module efi_pstore.
Dec  1 04:09:03 np0005540826 systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Dec  1 04:09:03 np0005540826 systemd[1]: Finished Generate network units from Kernel command line.
Dec  1 04:09:03 np0005540826 systemd[1]: Finished Remount Root and Kernel File Systems.
Dec  1 04:09:03 np0005540826 systemd[1]: Finished Apply Kernel Variables.
Dec  1 04:09:03 np0005540826 systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Dec  1 04:09:03 np0005540826 systemd[1]: Starting Rebuild Hardware Database...
Dec  1 04:09:03 np0005540826 systemd[1]: Starting Flush Journal to Persistent Storage...
Dec  1 04:09:03 np0005540826 systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Dec  1 04:09:03 np0005540826 systemd[1]: Starting Load/Save OS Random Seed...
Dec  1 04:09:03 np0005540826 systemd[1]: Starting Create System Users...
Dec  1 04:09:03 np0005540826 kernel: ACPI: bus type drm_connector registered
Dec  1 04:09:03 np0005540826 systemd-journald[682]: Runtime Journal (/run/log/journal/1f988c78c563e12389ab342aced42dbb) is 8.0M, max 153.6M, 145.6M free.
Dec  1 04:09:03 np0005540826 systemd-journald[682]: Received client request to flush runtime journal.
Dec  1 04:09:03 np0005540826 kernel: fuse: init (API version 7.37)
Dec  1 04:09:03 np0005540826 systemd[1]: modprobe@drm.service: Deactivated successfully.
Dec  1 04:09:03 np0005540826 systemd[1]: Finished Load Kernel Module drm.
Dec  1 04:09:03 np0005540826 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Dec  1 04:09:03 np0005540826 systemd[1]: Finished Load Kernel Module fuse.
Dec  1 04:09:03 np0005540826 systemd[1]: Finished Flush Journal to Persistent Storage.
Dec  1 04:09:03 np0005540826 systemd[1]: Mounting FUSE Control File System...
Dec  1 04:09:03 np0005540826 systemd[1]: Finished Load/Save OS Random Seed.
Dec  1 04:09:03 np0005540826 systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Dec  1 04:09:03 np0005540826 systemd[1]: Mounted FUSE Control File System.
Dec  1 04:09:03 np0005540826 systemd[1]: Finished Create System Users.
Dec  1 04:09:03 np0005540826 systemd[1]: Starting Create Static Device Nodes in /dev...
Dec  1 04:09:03 np0005540826 systemd[1]: Finished Coldplug All udev Devices.
Dec  1 04:09:03 np0005540826 systemd[1]: Finished Create Static Device Nodes in /dev.
Dec  1 04:09:03 np0005540826 systemd[1]: Reached target Preparation for Local File Systems.
Dec  1 04:09:03 np0005540826 systemd[1]: Reached target Local File Systems.
Dec  1 04:09:03 np0005540826 systemd[1]: Starting Rebuild Dynamic Linker Cache...
Dec  1 04:09:03 np0005540826 systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Dec  1 04:09:03 np0005540826 systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Dec  1 04:09:03 np0005540826 systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Dec  1 04:09:03 np0005540826 systemd[1]: Starting Automatic Boot Loader Update...
Dec  1 04:09:03 np0005540826 systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Dec  1 04:09:03 np0005540826 systemd[1]: Starting Create Volatile Files and Directories...
Dec  1 04:09:03 np0005540826 bootctl[700]: Couldn't find EFI system partition, skipping.
Dec  1 04:09:03 np0005540826 systemd[1]: Finished Automatic Boot Loader Update.
Dec  1 04:09:03 np0005540826 systemd[1]: Finished Create Volatile Files and Directories.
Dec  1 04:09:03 np0005540826 systemd[1]: Starting Security Auditing Service...
Dec  1 04:09:03 np0005540826 systemd[1]: Starting RPC Bind...
Dec  1 04:09:03 np0005540826 systemd[1]: Starting Rebuild Journal Catalog...
Dec  1 04:09:03 np0005540826 systemd[1]: Finished Rebuild Dynamic Linker Cache.
Dec  1 04:09:03 np0005540826 auditd[706]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Dec  1 04:09:03 np0005540826 auditd[706]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Dec  1 04:09:03 np0005540826 systemd[1]: Started RPC Bind.
Dec  1 04:09:03 np0005540826 systemd[1]: Finished Rebuild Journal Catalog.
Dec  1 04:09:03 np0005540826 augenrules[711]: /sbin/augenrules: No change
Dec  1 04:09:03 np0005540826 augenrules[726]: No rules
Dec  1 04:09:03 np0005540826 augenrules[726]: enabled 1
Dec  1 04:09:03 np0005540826 augenrules[726]: failure 1
Dec  1 04:09:03 np0005540826 augenrules[726]: pid 706
Dec  1 04:09:03 np0005540826 augenrules[726]: rate_limit 0
Dec  1 04:09:03 np0005540826 augenrules[726]: backlog_limit 8192
Dec  1 04:09:03 np0005540826 augenrules[726]: lost 0
Dec  1 04:09:03 np0005540826 augenrules[726]: backlog 1
Dec  1 04:09:03 np0005540826 augenrules[726]: backlog_wait_time 60000
Dec  1 04:09:03 np0005540826 augenrules[726]: backlog_wait_time_actual 0
Dec  1 04:09:03 np0005540826 augenrules[726]: enabled 1
Dec  1 04:09:03 np0005540826 augenrules[726]: failure 1
Dec  1 04:09:03 np0005540826 augenrules[726]: pid 706
Dec  1 04:09:03 np0005540826 augenrules[726]: rate_limit 0
Dec  1 04:09:03 np0005540826 augenrules[726]: backlog_limit 8192
Dec  1 04:09:03 np0005540826 augenrules[726]: lost 0
Dec  1 04:09:03 np0005540826 augenrules[726]: backlog 0
Dec  1 04:09:03 np0005540826 augenrules[726]: backlog_wait_time 60000
Dec  1 04:09:03 np0005540826 augenrules[726]: backlog_wait_time_actual 0
Dec  1 04:09:03 np0005540826 augenrules[726]: enabled 1
Dec  1 04:09:03 np0005540826 augenrules[726]: failure 1
Dec  1 04:09:03 np0005540826 augenrules[726]: pid 706
Dec  1 04:09:03 np0005540826 augenrules[726]: rate_limit 0
Dec  1 04:09:03 np0005540826 augenrules[726]: backlog_limit 8192
Dec  1 04:09:03 np0005540826 augenrules[726]: lost 0
Dec  1 04:09:03 np0005540826 augenrules[726]: backlog 1
Dec  1 04:09:03 np0005540826 augenrules[726]: backlog_wait_time 60000
Dec  1 04:09:03 np0005540826 augenrules[726]: backlog_wait_time_actual 0
Dec  1 04:09:03 np0005540826 systemd[1]: Started Security Auditing Service.
Dec  1 04:09:03 np0005540826 systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Dec  1 04:09:03 np0005540826 systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Dec  1 04:09:03 np0005540826 systemd[1]: Finished Rebuild Hardware Database.
Dec  1 04:09:03 np0005540826 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Dec  1 04:09:03 np0005540826 systemd[1]: Starting Update is Completed...
Dec  1 04:09:03 np0005540826 systemd[1]: Finished Update is Completed.
Dec  1 04:09:03 np0005540826 systemd-udevd[734]: Using default interface naming scheme 'rhel-9.0'.
Dec  1 04:09:03 np0005540826 systemd[1]: Started Rule-based Manager for Device Events and Files.
Dec  1 04:09:03 np0005540826 systemd[1]: Reached target System Initialization.
Dec  1 04:09:03 np0005540826 systemd[1]: Started dnf makecache --timer.
Dec  1 04:09:03 np0005540826 systemd[1]: Started Daily rotation of log files.
Dec  1 04:09:03 np0005540826 systemd[1]: Started Daily Cleanup of Temporary Directories.
Dec  1 04:09:03 np0005540826 systemd[1]: Reached target Timer Units.
Dec  1 04:09:03 np0005540826 systemd[1]: Listening on D-Bus System Message Bus Socket.
Dec  1 04:09:03 np0005540826 systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Dec  1 04:09:03 np0005540826 systemd[1]: Reached target Socket Units.
Dec  1 04:09:03 np0005540826 systemd[1]: Starting D-Bus System Message Bus...
Dec  1 04:09:03 np0005540826 systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Dec  1 04:09:03 np0005540826 systemd[1]: Starting Load Kernel Module configfs...
Dec  1 04:09:03 np0005540826 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Dec  1 04:09:03 np0005540826 systemd[1]: Finished Load Kernel Module configfs.
Dec  1 04:09:03 np0005540826 systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Dec  1 04:09:03 np0005540826 systemd-udevd[742]: Network interface NamePolicy= disabled on kernel command line.
Dec  1 04:09:03 np0005540826 systemd[1]: Started D-Bus System Message Bus.
Dec  1 04:09:03 np0005540826 systemd[1]: Reached target Basic System.
Dec  1 04:09:03 np0005540826 dbus-broker-lau[744]: Ready
Dec  1 04:09:03 np0005540826 systemd[1]: Starting NTP client/server...
Dec  1 04:09:03 np0005540826 systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Dec  1 04:09:03 np0005540826 systemd[1]: Starting Restore /run/initramfs on shutdown...
Dec  1 04:09:03 np0005540826 systemd[1]: Starting IPv4 firewall with iptables...
Dec  1 04:09:04 np0005540826 systemd[1]: Started irqbalance daemon.
Dec  1 04:09:04 np0005540826 systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Dec  1 04:09:04 np0005540826 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec  1 04:09:04 np0005540826 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec  1 04:09:04 np0005540826 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec  1 04:09:04 np0005540826 systemd[1]: Reached target sshd-keygen.target.
Dec  1 04:09:04 np0005540826 systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Dec  1 04:09:04 np0005540826 systemd[1]: Reached target User and Group Name Lookups.
Dec  1 04:09:04 np0005540826 systemd[1]: Starting User Login Management...
Dec  1 04:09:04 np0005540826 systemd[1]: Finished Restore /run/initramfs on shutdown.
Dec  1 04:09:04 np0005540826 kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Dec  1 04:09:04 np0005540826 chronyd[790]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Dec  1 04:09:04 np0005540826 chronyd[790]: Loaded 0 symmetric keys
Dec  1 04:09:04 np0005540826 chronyd[790]: Using right/UTC timezone to obtain leap second data
Dec  1 04:09:04 np0005540826 chronyd[790]: Loaded seccomp filter (level 2)
Dec  1 04:09:04 np0005540826 systemd[1]: Started NTP client/server.
Dec  1 04:09:04 np0005540826 systemd-logind[787]: Watching system buttons on /dev/input/event0 (Power Button)
Dec  1 04:09:04 np0005540826 systemd-logind[787]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Dec  1 04:09:04 np0005540826 systemd-logind[787]: New seat seat0.
Dec  1 04:09:04 np0005540826 systemd[1]: Started User Login Management.
Dec  1 04:09:04 np0005540826 kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Dec  1 04:09:04 np0005540826 kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Dec  1 04:09:04 np0005540826 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Dec  1 04:09:04 np0005540826 kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Dec  1 04:09:04 np0005540826 kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Dec  1 04:09:04 np0005540826 kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Dec  1 04:09:04 np0005540826 kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Dec  1 04:09:04 np0005540826 kernel: kvm_amd: TSC scaling supported
Dec  1 04:09:04 np0005540826 kernel: kvm_amd: Nested Virtualization enabled
Dec  1 04:09:04 np0005540826 kernel: kvm_amd: Nested Paging enabled
Dec  1 04:09:04 np0005540826 kernel: kvm_amd: LBR virtualization supported
Dec  1 04:09:04 np0005540826 kernel: Console: switching to colour dummy device 80x25
Dec  1 04:09:04 np0005540826 kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Dec  1 04:09:04 np0005540826 kernel: [drm] features: -context_init
Dec  1 04:09:04 np0005540826 kernel: [drm] number of scanouts: 1
Dec  1 04:09:04 np0005540826 kernel: [drm] number of cap sets: 0
Dec  1 04:09:04 np0005540826 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Dec  1 04:09:04 np0005540826 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Dec  1 04:09:04 np0005540826 kernel: Console: switching to colour frame buffer device 128x48
Dec  1 04:09:04 np0005540826 kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Dec  1 04:09:04 np0005540826 cloud-init[814]: Cloud-init v. 24.4-7.el9 running 'init-local' at Mon, 01 Dec 2025 09:09:04 +0000. Up 7.72 seconds.
Dec  1 04:09:04 np0005540826 iptables.init[781]: iptables: Applying firewall rules: [  OK  ]
Dec  1 04:09:04 np0005540826 systemd[1]: Finished IPv4 firewall with iptables.
Dec  1 04:09:04 np0005540826 systemd[1]: run-cloud\x2dinit-tmp-tmp2rgzi55s.mount: Deactivated successfully.
Dec  1 04:09:05 np0005540826 systemd[1]: Starting Hostname Service...
Dec  1 04:09:05 np0005540826 systemd[1]: Started Hostname Service.
Dec  1 04:09:05 np0005540826 systemd-hostnamed[856]: Hostname set to <np0005540826.novalocal> (static)
Dec  1 04:09:05 np0005540826 systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Dec  1 04:09:05 np0005540826 systemd[1]: Reached target Preparation for Network.
Dec  1 04:09:05 np0005540826 systemd[1]: Starting Network Manager...
Dec  1 04:09:05 np0005540826 NetworkManager[860]: <info>  [1764580145.2849] NetworkManager (version 1.54.1-1.el9) is starting... (boot:3f13cbd7-efc0-4c84-8ecb-a4cfac3719b9)
Dec  1 04:09:05 np0005540826 NetworkManager[860]: <info>  [1764580145.2853] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Dec  1 04:09:05 np0005540826 NetworkManager[860]: <info>  [1764580145.2913] manager[0x559b3b5dc080]: monitoring kernel firmware directory '/lib/firmware'.
Dec  1 04:09:05 np0005540826 NetworkManager[860]: <info>  [1764580145.2948] hostname: hostname: using hostnamed
Dec  1 04:09:05 np0005540826 NetworkManager[860]: <info>  [1764580145.2948] hostname: static hostname changed from (none) to "np0005540826.novalocal"
Dec  1 04:09:05 np0005540826 NetworkManager[860]: <info>  [1764580145.2952] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Dec  1 04:09:05 np0005540826 NetworkManager[860]: <info>  [1764580145.3052] manager[0x559b3b5dc080]: rfkill: Wi-Fi hardware radio set enabled
Dec  1 04:09:05 np0005540826 NetworkManager[860]: <info>  [1764580145.3054] manager[0x559b3b5dc080]: rfkill: WWAN hardware radio set enabled
Dec  1 04:09:05 np0005540826 systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Dec  1 04:09:05 np0005540826 NetworkManager[860]: <info>  [1764580145.3113] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Dec  1 04:09:05 np0005540826 NetworkManager[860]: <info>  [1764580145.3113] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Dec  1 04:09:05 np0005540826 NetworkManager[860]: <info>  [1764580145.3114] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Dec  1 04:09:05 np0005540826 NetworkManager[860]: <info>  [1764580145.3114] manager: Networking is enabled by state file
Dec  1 04:09:05 np0005540826 NetworkManager[860]: <info>  [1764580145.3116] settings: Loaded settings plugin: keyfile (internal)
Dec  1 04:09:05 np0005540826 NetworkManager[860]: <info>  [1764580145.3124] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Dec  1 04:09:05 np0005540826 NetworkManager[860]: <info>  [1764580145.3144] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Dec  1 04:09:05 np0005540826 NetworkManager[860]: <info>  [1764580145.3158] dhcp: init: Using DHCP client 'internal'
Dec  1 04:09:05 np0005540826 NetworkManager[860]: <info>  [1764580145.3160] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Dec  1 04:09:05 np0005540826 NetworkManager[860]: <info>  [1764580145.3174] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  1 04:09:05 np0005540826 NetworkManager[860]: <info>  [1764580145.3182] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Dec  1 04:09:05 np0005540826 NetworkManager[860]: <info>  [1764580145.3189] device (lo): Activation: starting connection 'lo' (779fda9d-3aff-418f-a33a-34076793e6c3)
Dec  1 04:09:05 np0005540826 NetworkManager[860]: <info>  [1764580145.3200] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Dec  1 04:09:05 np0005540826 NetworkManager[860]: <info>  [1764580145.3203] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  1 04:09:05 np0005540826 NetworkManager[860]: <info>  [1764580145.3232] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Dec  1 04:09:05 np0005540826 NetworkManager[860]: <info>  [1764580145.3236] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Dec  1 04:09:05 np0005540826 NetworkManager[860]: <info>  [1764580145.3239] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Dec  1 04:09:05 np0005540826 NetworkManager[860]: <info>  [1764580145.3242] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Dec  1 04:09:05 np0005540826 NetworkManager[860]: <info>  [1764580145.3244] device (eth0): carrier: link connected
Dec  1 04:09:05 np0005540826 NetworkManager[860]: <info>  [1764580145.3248] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Dec  1 04:09:05 np0005540826 NetworkManager[860]: <info>  [1764580145.3253] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Dec  1 04:09:05 np0005540826 NetworkManager[860]: <info>  [1764580145.3258] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec  1 04:09:05 np0005540826 NetworkManager[860]: <info>  [1764580145.3263] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec  1 04:09:05 np0005540826 NetworkManager[860]: <info>  [1764580145.3265] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  1 04:09:05 np0005540826 NetworkManager[860]: <info>  [1764580145.3267] manager: NetworkManager state is now CONNECTING
Dec  1 04:09:05 np0005540826 NetworkManager[860]: <info>  [1764580145.3269] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  1 04:09:05 np0005540826 NetworkManager[860]: <info>  [1764580145.3275] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  1 04:09:05 np0005540826 NetworkManager[860]: <info>  [1764580145.3279] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec  1 04:09:05 np0005540826 NetworkManager[860]: <info>  [1764580145.3320] dhcp4 (eth0): state changed new lease, address=38.102.83.230
Dec  1 04:09:05 np0005540826 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec  1 04:09:05 np0005540826 systemd[1]: Started Network Manager.
Dec  1 04:09:05 np0005540826 NetworkManager[860]: <info>  [1764580145.3342] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Dec  1 04:09:05 np0005540826 systemd[1]: Reached target Network.
Dec  1 04:09:05 np0005540826 NetworkManager[860]: <info>  [1764580145.3385] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  1 04:09:05 np0005540826 systemd[1]: Starting Network Manager Wait Online...
Dec  1 04:09:05 np0005540826 systemd[1]: Starting GSSAPI Proxy Daemon...
Dec  1 04:09:05 np0005540826 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec  1 04:09:05 np0005540826 NetworkManager[860]: <info>  [1764580145.3458] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Dec  1 04:09:05 np0005540826 NetworkManager[860]: <info>  [1764580145.3460] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Dec  1 04:09:05 np0005540826 NetworkManager[860]: <info>  [1764580145.3466] device (lo): Activation: successful, device activated.
Dec  1 04:09:05 np0005540826 NetworkManager[860]: <info>  [1764580145.3472] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  1 04:09:05 np0005540826 NetworkManager[860]: <info>  [1764580145.3474] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  1 04:09:05 np0005540826 NetworkManager[860]: <info>  [1764580145.3477] manager: NetworkManager state is now CONNECTED_SITE
Dec  1 04:09:05 np0005540826 NetworkManager[860]: <info>  [1764580145.3480] device (eth0): Activation: successful, device activated.
Dec  1 04:09:05 np0005540826 NetworkManager[860]: <info>  [1764580145.3484] manager: NetworkManager state is now CONNECTED_GLOBAL
Dec  1 04:09:05 np0005540826 NetworkManager[860]: <info>  [1764580145.3487] manager: startup complete
Dec  1 04:09:05 np0005540826 systemd[1]: Finished Network Manager Wait Online.
Dec  1 04:09:05 np0005540826 systemd[1]: Started GSSAPI Proxy Daemon.
Dec  1 04:09:05 np0005540826 systemd[1]: Starting Cloud-init: Network Stage...
Dec  1 04:09:05 np0005540826 systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Dec  1 04:09:05 np0005540826 systemd[1]: Reached target NFS client services.
Dec  1 04:09:05 np0005540826 systemd[1]: Reached target Preparation for Remote File Systems.
Dec  1 04:09:05 np0005540826 systemd[1]: Reached target Remote File Systems.
Dec  1 04:09:05 np0005540826 systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Dec  1 04:09:05 np0005540826 cloud-init[923]: Cloud-init v. 24.4-7.el9 running 'init' at Mon, 01 Dec 2025 09:09:05 +0000. Up 8.67 seconds.
Dec  1 04:09:05 np0005540826 cloud-init[923]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Dec  1 04:09:05 np0005540826 cloud-init[923]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Dec  1 04:09:05 np0005540826 cloud-init[923]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Dec  1 04:09:05 np0005540826 cloud-init[923]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Dec  1 04:09:05 np0005540826 cloud-init[923]: ci-info: |  eth0  | True |        38.102.83.230         | 255.255.255.0 | global | fa:16:3e:22:69:06 |
Dec  1 04:09:05 np0005540826 cloud-init[923]: ci-info: |  eth0  | True | fe80::f816:3eff:fe22:6906/64 |       .       |  link  | fa:16:3e:22:69:06 |
Dec  1 04:09:05 np0005540826 cloud-init[923]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Dec  1 04:09:05 np0005540826 cloud-init[923]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Dec  1 04:09:05 np0005540826 cloud-init[923]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Dec  1 04:09:05 np0005540826 cloud-init[923]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Dec  1 04:09:05 np0005540826 cloud-init[923]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Dec  1 04:09:05 np0005540826 cloud-init[923]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Dec  1 04:09:05 np0005540826 cloud-init[923]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Dec  1 04:09:05 np0005540826 cloud-init[923]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Dec  1 04:09:05 np0005540826 cloud-init[923]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Dec  1 04:09:05 np0005540826 cloud-init[923]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Dec  1 04:09:05 np0005540826 cloud-init[923]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Dec  1 04:09:05 np0005540826 cloud-init[923]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Dec  1 04:09:05 np0005540826 cloud-init[923]: ci-info: +-------+-------------+---------+-----------+-------+
Dec  1 04:09:05 np0005540826 cloud-init[923]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Dec  1 04:09:05 np0005540826 cloud-init[923]: ci-info: +-------+-------------+---------+-----------+-------+
Dec  1 04:09:05 np0005540826 cloud-init[923]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Dec  1 04:09:05 np0005540826 cloud-init[923]: ci-info: |   3   |  multicast  |    ::   |    eth0   |   U   |
Dec  1 04:09:05 np0005540826 cloud-init[923]: ci-info: +-------+-------------+---------+-----------+-------+
Dec  1 04:09:07 np0005540826 cloud-init[923]: Generating public/private rsa key pair.
Dec  1 04:09:07 np0005540826 cloud-init[923]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Dec  1 04:09:07 np0005540826 cloud-init[923]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Dec  1 04:09:07 np0005540826 cloud-init[923]: The key fingerprint is:
Dec  1 04:09:07 np0005540826 cloud-init[923]: SHA256:F2VUt1Vz2fmJKKTCJgit4cbB/7azE6++iYefnfl5zKU root@np0005540826.novalocal
Dec  1 04:09:07 np0005540826 cloud-init[923]: The key's randomart image is:
Dec  1 04:09:07 np0005540826 cloud-init[923]: +---[RSA 3072]----+
Dec  1 04:09:07 np0005540826 cloud-init[923]: |..         .+...X|
Dec  1 04:09:07 np0005540826 cloud-init[923]: |oo.      . o   +*|
Dec  1 04:09:07 np0005540826 cloud-init[923]: |+o+ .   o . . ..o|
Dec  1 04:09:07 np0005540826 cloud-init[923]: |.= o + . . o . ..|
Dec  1 04:09:07 np0005540826 cloud-init[923]: |.   + . S o      |
Dec  1 04:09:07 np0005540826 cloud-init[923]: |     +   .   .   |
Dec  1 04:09:07 np0005540826 cloud-init[923]: |    o +   o o    |
Dec  1 04:09:07 np0005540826 cloud-init[923]: |   ..=+.o .E     |
Dec  1 04:09:07 np0005540826 cloud-init[923]: |   .+O*+.o.      |
Dec  1 04:09:07 np0005540826 cloud-init[923]: +----[SHA256]-----+
Dec  1 04:09:07 np0005540826 cloud-init[923]: Generating public/private ecdsa key pair.
Dec  1 04:09:07 np0005540826 cloud-init[923]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Dec  1 04:09:07 np0005540826 cloud-init[923]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Dec  1 04:09:07 np0005540826 cloud-init[923]: The key fingerprint is:
Dec  1 04:09:07 np0005540826 cloud-init[923]: SHA256:oodBK8Dq1NllZSMICwN/GtHzYan8ZBMDYAPrG+1i38c root@np0005540826.novalocal
Dec  1 04:09:07 np0005540826 cloud-init[923]: The key's randomart image is:
Dec  1 04:09:07 np0005540826 cloud-init[923]: +---[ECDSA 256]---+
Dec  1 04:09:07 np0005540826 cloud-init[923]: |++=+o..o +       |
Dec  1 04:09:07 np0005540826 cloud-init[923]: |.+oo+.* + .      |
Dec  1 04:09:07 np0005540826 cloud-init[923]: |o.ooo= *         |
Dec  1 04:09:07 np0005540826 cloud-init[923]: |o.o==.B          |
Dec  1 04:09:07 np0005540826 cloud-init[923]: |.+o+o=..S        |
Dec  1 04:09:07 np0005540826 cloud-init[923]: |o +. +..         |
Dec  1 04:09:07 np0005540826 cloud-init[923]: | = .o o          |
Dec  1 04:09:07 np0005540826 cloud-init[923]: |. o .. E         |
Dec  1 04:09:07 np0005540826 cloud-init[923]: |   . ..          |
Dec  1 04:09:07 np0005540826 cloud-init[923]: +----[SHA256]-----+
Dec  1 04:09:07 np0005540826 cloud-init[923]: Generating public/private ed25519 key pair.
Dec  1 04:09:07 np0005540826 cloud-init[923]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Dec  1 04:09:07 np0005540826 cloud-init[923]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Dec  1 04:09:07 np0005540826 cloud-init[923]: The key fingerprint is:
Dec  1 04:09:07 np0005540826 cloud-init[923]: SHA256:wKZoq1iXxIGek1Y2AMO/gb4hAzd7cWu/jEbipLFDPvg root@np0005540826.novalocal
Dec  1 04:09:07 np0005540826 cloud-init[923]: The key's randomart image is:
Dec  1 04:09:07 np0005540826 cloud-init[923]: +--[ED25519 256]--+
Dec  1 04:09:07 np0005540826 cloud-init[923]: |+..              |
Dec  1 04:09:07 np0005540826 cloud-init[923]: | o o .           |
Dec  1 04:09:07 np0005540826 cloud-init[923]: |  + = +          |
Dec  1 04:09:07 np0005540826 cloud-init[923]: |.ooX.=..         |
Dec  1 04:09:07 np0005540826 cloud-init[923]: |o.Oo*o .S        |
Dec  1 04:09:07 np0005540826 cloud-init[923]: |o=+=+.+          |
Dec  1 04:09:07 np0005540826 cloud-init[923]: |.==Bo+ .         |
Dec  1 04:09:07 np0005540826 cloud-init[923]: |o+*.. .o.        |
Dec  1 04:09:07 np0005540826 cloud-init[923]: |o.Eo .. o.       |
Dec  1 04:09:07 np0005540826 cloud-init[923]: +----[SHA256]-----+
Dec  1 04:09:07 np0005540826 systemd[1]: Finished Cloud-init: Network Stage.
Dec  1 04:09:07 np0005540826 systemd[1]: Reached target Cloud-config availability.
Dec  1 04:09:07 np0005540826 systemd[1]: Reached target Network is Online.
Dec  1 04:09:07 np0005540826 systemd[1]: Starting Cloud-init: Config Stage...
Dec  1 04:09:07 np0005540826 systemd[1]: Starting Crash recovery kernel arming...
Dec  1 04:09:07 np0005540826 systemd[1]: Starting Notify NFS peers of a restart...
Dec  1 04:09:07 np0005540826 systemd[1]: Starting System Logging Service...
Dec  1 04:09:07 np0005540826 sm-notify[1005]: Version 2.5.4 starting
Dec  1 04:09:07 np0005540826 systemd[1]: Starting OpenSSH server daemon...
Dec  1 04:09:07 np0005540826 systemd[1]: Starting Permit User Sessions...
Dec  1 04:09:07 np0005540826 systemd[1]: Started Notify NFS peers of a restart.
Dec  1 04:09:07 np0005540826 systemd[1]: Started OpenSSH server daemon.
Dec  1 04:09:07 np0005540826 systemd[1]: Finished Permit User Sessions.
Dec  1 04:09:07 np0005540826 systemd[1]: Started Command Scheduler.
Dec  1 04:09:07 np0005540826 systemd[1]: Started Getty on tty1.
Dec  1 04:09:07 np0005540826 systemd[1]: Started Serial Getty on ttyS0.
Dec  1 04:09:07 np0005540826 systemd[1]: Reached target Login Prompts.
Dec  1 04:09:07 np0005540826 rsyslogd[1006]: [origin software="rsyslogd" swVersion="8.2510.0-2.el9" x-pid="1006" x-info="https://www.rsyslog.com"] start
Dec  1 04:09:07 np0005540826 systemd[1]: Started System Logging Service.
Dec  1 04:09:07 np0005540826 rsyslogd[1006]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Dec  1 04:09:07 np0005540826 systemd[1]: Reached target Multi-User System.
Dec  1 04:09:07 np0005540826 systemd[1]: Starting Record Runlevel Change in UTMP...
Dec  1 04:09:07 np0005540826 systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Dec  1 04:09:07 np0005540826 systemd[1]: Finished Record Runlevel Change in UTMP.
Dec  1 04:09:07 np0005540826 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  1 04:09:07 np0005540826 kdumpctl[1014]: kdump: No kdump initial ramdisk found.
Dec  1 04:09:07 np0005540826 kdumpctl[1014]: kdump: Rebuilding /boot/initramfs-5.14.0-642.el9.x86_64kdump.img
Dec  1 04:09:07 np0005540826 cloud-init[1239]: Cloud-init v. 24.4-7.el9 running 'modules:config' at Mon, 01 Dec 2025 09:09:07 +0000. Up 10.58 seconds.
Dec  1 04:09:07 np0005540826 systemd[1]: Finished Cloud-init: Config Stage.
Dec  1 04:09:07 np0005540826 dracut[1266]: dracut-057-102.git20250818.el9
Dec  1 04:09:07 np0005540826 systemd[1]: Starting Cloud-init: Final Stage...
Dec  1 04:09:07 np0005540826 dracut[1269]: Executing: /usr/bin/dracut --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  --mount "/dev/disk/by-uuid/b277050f-8ace-464d-abb6-4c46d4c45253 /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device --add-confdir /lib/kdump/dracut.conf.d -f /boot/initramfs-5.14.0-642.el9.x86_64kdump.img 5.14.0-642.el9.x86_64
Dec  1 04:09:08 np0005540826 cloud-init[1355]: Cloud-init v. 24.4-7.el9 running 'modules:final' at Mon, 01 Dec 2025 09:09:08 +0000. Up 11.04 seconds.
Dec  1 04:09:08 np0005540826 cloud-init[1368]: #############################################################
Dec  1 04:09:08 np0005540826 cloud-init[1370]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Dec  1 04:09:08 np0005540826 cloud-init[1375]: 256 SHA256:oodBK8Dq1NllZSMICwN/GtHzYan8ZBMDYAPrG+1i38c root@np0005540826.novalocal (ECDSA)
Dec  1 04:09:08 np0005540826 cloud-init[1377]: 256 SHA256:wKZoq1iXxIGek1Y2AMO/gb4hAzd7cWu/jEbipLFDPvg root@np0005540826.novalocal (ED25519)
Dec  1 04:09:08 np0005540826 cloud-init[1382]: 3072 SHA256:F2VUt1Vz2fmJKKTCJgit4cbB/7azE6++iYefnfl5zKU root@np0005540826.novalocal (RSA)
Dec  1 04:09:08 np0005540826 cloud-init[1383]: -----END SSH HOST KEY FINGERPRINTS-----
Dec  1 04:09:08 np0005540826 cloud-init[1384]: #############################################################
Dec  1 04:09:08 np0005540826 cloud-init[1355]: Cloud-init v. 24.4-7.el9 finished at Mon, 01 Dec 2025 09:09:08 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 11.21 seconds
Dec  1 04:09:08 np0005540826 systemd[1]: Finished Cloud-init: Final Stage.
Dec  1 04:09:08 np0005540826 systemd[1]: Reached target Cloud-init target.
Dec  1 04:09:08 np0005540826 dracut[1269]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Dec  1 04:09:08 np0005540826 dracut[1269]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Dec  1 04:09:08 np0005540826 dracut[1269]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Dec  1 04:09:08 np0005540826 dracut[1269]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Dec  1 04:09:08 np0005540826 dracut[1269]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Dec  1 04:09:08 np0005540826 dracut[1269]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Dec  1 04:09:08 np0005540826 dracut[1269]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Dec  1 04:09:08 np0005540826 dracut[1269]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Dec  1 04:09:08 np0005540826 dracut[1269]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Dec  1 04:09:08 np0005540826 dracut[1269]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Dec  1 04:09:08 np0005540826 dracut[1269]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Dec  1 04:09:08 np0005540826 dracut[1269]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Dec  1 04:09:08 np0005540826 dracut[1269]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Dec  1 04:09:08 np0005540826 dracut[1269]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Dec  1 04:09:08 np0005540826 dracut[1269]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Dec  1 04:09:08 np0005540826 dracut[1269]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Dec  1 04:09:08 np0005540826 dracut[1269]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Dec  1 04:09:08 np0005540826 dracut[1269]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Dec  1 04:09:08 np0005540826 dracut[1269]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Dec  1 04:09:08 np0005540826 dracut[1269]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Dec  1 04:09:08 np0005540826 dracut[1269]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Dec  1 04:09:08 np0005540826 dracut[1269]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Dec  1 04:09:08 np0005540826 dracut[1269]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Dec  1 04:09:08 np0005540826 dracut[1269]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Dec  1 04:09:08 np0005540826 dracut[1269]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Dec  1 04:09:08 np0005540826 dracut[1269]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Dec  1 04:09:08 np0005540826 dracut[1269]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Dec  1 04:09:08 np0005540826 dracut[1269]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Dec  1 04:09:08 np0005540826 dracut[1269]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Dec  1 04:09:08 np0005540826 dracut[1269]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Dec  1 04:09:08 np0005540826 dracut[1269]: memstrack is not available
Dec  1 04:09:08 np0005540826 dracut[1269]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Dec  1 04:09:08 np0005540826 dracut[1269]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Dec  1 04:09:08 np0005540826 dracut[1269]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Dec  1 04:09:08 np0005540826 dracut[1269]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Dec  1 04:09:08 np0005540826 dracut[1269]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Dec  1 04:09:08 np0005540826 dracut[1269]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Dec  1 04:09:08 np0005540826 dracut[1269]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Dec  1 04:09:08 np0005540826 dracut[1269]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Dec  1 04:09:08 np0005540826 dracut[1269]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Dec  1 04:09:08 np0005540826 dracut[1269]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Dec  1 04:09:08 np0005540826 dracut[1269]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Dec  1 04:09:08 np0005540826 dracut[1269]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Dec  1 04:09:08 np0005540826 dracut[1269]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Dec  1 04:09:08 np0005540826 dracut[1269]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Dec  1 04:09:08 np0005540826 dracut[1269]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Dec  1 04:09:08 np0005540826 dracut[1269]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Dec  1 04:09:08 np0005540826 dracut[1269]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Dec  1 04:09:08 np0005540826 dracut[1269]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Dec  1 04:09:08 np0005540826 dracut[1269]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Dec  1 04:09:09 np0005540826 dracut[1269]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Dec  1 04:09:09 np0005540826 dracut[1269]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Dec  1 04:09:09 np0005540826 dracut[1269]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Dec  1 04:09:09 np0005540826 dracut[1269]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Dec  1 04:09:09 np0005540826 dracut[1269]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Dec  1 04:09:09 np0005540826 dracut[1269]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Dec  1 04:09:09 np0005540826 dracut[1269]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Dec  1 04:09:09 np0005540826 dracut[1269]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Dec  1 04:09:09 np0005540826 dracut[1269]: memstrack is not available
Dec  1 04:09:09 np0005540826 dracut[1269]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Dec  1 04:09:09 np0005540826 dracut[1269]: *** Including module: systemd ***
Dec  1 04:09:09 np0005540826 dracut[1269]: *** Including module: fips ***
Dec  1 04:09:09 np0005540826 dracut[1269]: *** Including module: systemd-initrd ***
Dec  1 04:09:09 np0005540826 dracut[1269]: *** Including module: i18n ***
Dec  1 04:09:10 np0005540826 dracut[1269]: *** Including module: drm ***
Dec  1 04:09:10 np0005540826 chronyd[790]: Selected source 174.138.193.90 (2.centos.pool.ntp.org)
Dec  1 04:09:10 np0005540826 chronyd[790]: System clock TAI offset set to 37 seconds
Dec  1 04:09:10 np0005540826 dracut[1269]: *** Including module: prefixdevname ***
Dec  1 04:09:10 np0005540826 dracut[1269]: *** Including module: kernel-modules ***
Dec  1 04:09:10 np0005540826 kernel: block vda: the capability attribute has been deprecated.
Dec  1 04:09:11 np0005540826 dracut[1269]: *** Including module: kernel-modules-extra ***
Dec  1 04:09:11 np0005540826 dracut[1269]: *** Including module: qemu ***
Dec  1 04:09:11 np0005540826 dracut[1269]: *** Including module: fstab-sys ***
Dec  1 04:09:11 np0005540826 dracut[1269]: *** Including module: rootfs-block ***
Dec  1 04:09:11 np0005540826 dracut[1269]: *** Including module: terminfo ***
Dec  1 04:09:11 np0005540826 dracut[1269]: *** Including module: udev-rules ***
Dec  1 04:09:11 np0005540826 dracut[1269]: Skipping udev rule: 91-permissions.rules
Dec  1 04:09:11 np0005540826 dracut[1269]: Skipping udev rule: 80-drivers-modprobe.rules
Dec  1 04:09:11 np0005540826 dracut[1269]: *** Including module: virtiofs ***
Dec  1 04:09:11 np0005540826 dracut[1269]: *** Including module: dracut-systemd ***
Dec  1 04:09:12 np0005540826 dracut[1269]: *** Including module: usrmount ***
Dec  1 04:09:12 np0005540826 dracut[1269]: *** Including module: base ***
Dec  1 04:09:12 np0005540826 dracut[1269]: *** Including module: fs-lib ***
Dec  1 04:09:12 np0005540826 dracut[1269]: *** Including module: kdumpbase ***
Dec  1 04:09:12 np0005540826 dracut[1269]: *** Including module: microcode_ctl-fw_dir_override ***
Dec  1 04:09:12 np0005540826 dracut[1269]:  microcode_ctl module: mangling fw_dir
Dec  1 04:09:12 np0005540826 dracut[1269]:    microcode_ctl: reset fw_dir to "/lib/firmware/updates /lib/firmware"
Dec  1 04:09:12 np0005540826 dracut[1269]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Dec  1 04:09:12 np0005540826 dracut[1269]:    microcode_ctl: configuration "intel" is ignored
Dec  1 04:09:12 np0005540826 dracut[1269]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Dec  1 04:09:12 np0005540826 dracut[1269]:    microcode_ctl: configuration "intel-06-2d-07" is ignored
Dec  1 04:09:12 np0005540826 dracut[1269]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Dec  1 04:09:12 np0005540826 dracut[1269]:    microcode_ctl: configuration "intel-06-4e-03" is ignored
Dec  1 04:09:13 np0005540826 dracut[1269]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Dec  1 04:09:13 np0005540826 dracut[1269]:    microcode_ctl: configuration "intel-06-4f-01" is ignored
Dec  1 04:09:13 np0005540826 dracut[1269]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Dec  1 04:09:13 np0005540826 dracut[1269]:    microcode_ctl: configuration "intel-06-55-04" is ignored
Dec  1 04:09:13 np0005540826 dracut[1269]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Dec  1 04:09:13 np0005540826 dracut[1269]:    microcode_ctl: configuration "intel-06-5e-03" is ignored
Dec  1 04:09:13 np0005540826 dracut[1269]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Dec  1 04:09:13 np0005540826 dracut[1269]:    microcode_ctl: configuration "intel-06-8c-01" is ignored
Dec  1 04:09:13 np0005540826 dracut[1269]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Dec  1 04:09:13 np0005540826 dracut[1269]:    microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Dec  1 04:09:13 np0005540826 dracut[1269]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Dec  1 04:09:13 np0005540826 dracut[1269]:    microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Dec  1 04:09:13 np0005540826 dracut[1269]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8f-08"...
Dec  1 04:09:13 np0005540826 dracut[1269]:    microcode_ctl: configuration "intel-06-8f-08" is ignored
Dec  1 04:09:13 np0005540826 dracut[1269]:    microcode_ctl: final fw_dir: "/lib/firmware/updates /lib/firmware"
Dec  1 04:09:13 np0005540826 dracut[1269]: *** Including module: openssl ***
Dec  1 04:09:13 np0005540826 dracut[1269]: *** Including module: shutdown ***
Dec  1 04:09:13 np0005540826 dracut[1269]: *** Including module: squash ***
Dec  1 04:09:13 np0005540826 dracut[1269]: *** Including modules done ***
Dec  1 04:09:13 np0005540826 dracut[1269]: *** Installing kernel module dependencies ***
Dec  1 04:09:14 np0005540826 dracut[1269]: *** Installing kernel module dependencies done ***
Dec  1 04:09:14 np0005540826 dracut[1269]: *** Resolving executable dependencies ***
Dec  1 04:09:15 np0005540826 irqbalance[785]: Cannot change IRQ 25 affinity: Operation not permitted
Dec  1 04:09:15 np0005540826 irqbalance[785]: IRQ 25 affinity is now unmanaged
Dec  1 04:09:15 np0005540826 irqbalance[785]: Cannot change IRQ 31 affinity: Operation not permitted
Dec  1 04:09:15 np0005540826 irqbalance[785]: IRQ 31 affinity is now unmanaged
Dec  1 04:09:15 np0005540826 irqbalance[785]: Cannot change IRQ 28 affinity: Operation not permitted
Dec  1 04:09:15 np0005540826 irqbalance[785]: IRQ 28 affinity is now unmanaged
Dec  1 04:09:15 np0005540826 irqbalance[785]: Cannot change IRQ 32 affinity: Operation not permitted
Dec  1 04:09:15 np0005540826 irqbalance[785]: IRQ 32 affinity is now unmanaged
Dec  1 04:09:15 np0005540826 irqbalance[785]: Cannot change IRQ 30 affinity: Operation not permitted
Dec  1 04:09:15 np0005540826 irqbalance[785]: IRQ 30 affinity is now unmanaged
Dec  1 04:09:15 np0005540826 irqbalance[785]: Cannot change IRQ 29 affinity: Operation not permitted
Dec  1 04:09:15 np0005540826 irqbalance[785]: IRQ 29 affinity is now unmanaged
Dec  1 04:09:15 np0005540826 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec  1 04:09:16 np0005540826 dracut[1269]: *** Resolving executable dependencies done ***
Dec  1 04:09:16 np0005540826 dracut[1269]: *** Generating early-microcode cpio image ***
Dec  1 04:09:16 np0005540826 dracut[1269]: *** Store current command line parameters ***
Dec  1 04:09:16 np0005540826 dracut[1269]: Stored kernel commandline:
Dec  1 04:09:16 np0005540826 dracut[1269]: No dracut internal kernel commandline stored in the initramfs
Dec  1 04:09:16 np0005540826 dracut[1269]: *** Install squash loader ***
Dec  1 04:09:17 np0005540826 dracut[1269]: *** Squashing the files inside the initramfs ***
Dec  1 04:09:18 np0005540826 dracut[1269]: *** Squashing the files inside the initramfs done ***
Dec  1 04:09:18 np0005540826 dracut[1269]: *** Creating image file '/boot/initramfs-5.14.0-642.el9.x86_64kdump.img' ***
Dec  1 04:09:18 np0005540826 dracut[1269]: *** Hardlinking files ***
Dec  1 04:09:18 np0005540826 dracut[1269]: *** Hardlinking files done ***
Dec  1 04:09:18 np0005540826 dracut[1269]: *** Creating initramfs image file '/boot/initramfs-5.14.0-642.el9.x86_64kdump.img' done ***
Dec  1 04:09:19 np0005540826 kdumpctl[1014]: kdump: kexec: loaded kdump kernel
Dec  1 04:09:19 np0005540826 kdumpctl[1014]: kdump: Starting kdump: [OK]
Dec  1 04:09:19 np0005540826 systemd[1]: Finished Crash recovery kernel arming.
Dec  1 04:09:19 np0005540826 systemd[1]: Startup finished in 2.853s (kernel) + 2.576s (initrd) + 17.353s (userspace) = 22.783s.
Dec  1 04:09:31 np0005540826 systemd[1]: Created slice User Slice of UID 1000.
Dec  1 04:09:31 np0005540826 systemd[1]: Starting User Runtime Directory /run/user/1000...
Dec  1 04:09:31 np0005540826 systemd-logind[787]: New session 1 of user zuul.
Dec  1 04:09:31 np0005540826 systemd[1]: Finished User Runtime Directory /run/user/1000.
Dec  1 04:09:31 np0005540826 systemd[1]: Starting User Manager for UID 1000...
Dec  1 04:09:31 np0005540826 systemd[4305]: Queued start job for default target Main User Target.
Dec  1 04:09:31 np0005540826 systemd[4305]: Created slice User Application Slice.
Dec  1 04:09:31 np0005540826 systemd[4305]: Started Mark boot as successful after the user session has run 2 minutes.
Dec  1 04:09:31 np0005540826 systemd[4305]: Started Daily Cleanup of User's Temporary Directories.
Dec  1 04:09:31 np0005540826 systemd[4305]: Reached target Paths.
Dec  1 04:09:31 np0005540826 systemd[4305]: Reached target Timers.
Dec  1 04:09:31 np0005540826 systemd[4305]: Starting D-Bus User Message Bus Socket...
Dec  1 04:09:31 np0005540826 systemd[4305]: Starting Create User's Volatile Files and Directories...
Dec  1 04:09:31 np0005540826 systemd[4305]: Listening on D-Bus User Message Bus Socket.
Dec  1 04:09:31 np0005540826 systemd[4305]: Reached target Sockets.
Dec  1 04:09:31 np0005540826 systemd[4305]: Finished Create User's Volatile Files and Directories.
Dec  1 04:09:31 np0005540826 systemd[4305]: Reached target Basic System.
Dec  1 04:09:31 np0005540826 systemd[4305]: Reached target Main User Target.
Dec  1 04:09:31 np0005540826 systemd[4305]: Startup finished in 115ms.
Dec  1 04:09:31 np0005540826 systemd[1]: Started User Manager for UID 1000.
Dec  1 04:09:31 np0005540826 systemd[1]: Started Session 1 of User zuul.
Dec  1 04:09:32 np0005540826 python3[4388]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:09:35 np0005540826 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec  1 04:09:35 np0005540826 python3[4418]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:09:44 np0005540826 python3[4477]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:09:45 np0005540826 python3[4517]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Dec  1 04:09:47 np0005540826 python3[4543]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCs83Me/XJ93JONH+A3ys3BwT4zj02WAeI+PLa+4ictmx5jo+8RBm+8bQesnDGHtSEP3xHjam8Fwfo48sUz5kG1CEXeLWH7xBEXZQ+pidesIq17dWuB2YicfBCHGhZlqb9l/fISdA7PnN5BsCCyr5hQUlvwUPLq0dzE02EgJGcgUqI2ytoS8AvmZ5RX7c4IqGNOi3dFOny3uCDUlNZf/m10t5Eqaq53DNvn55ZT7HmuZuq1QSut2qopHMOrbqUIx17TPb+KiAJG5h8+CV0pJKLq1fSsJaTqR/MZTXsPF5oJHMT5BqnKmRCBNJyY+ko1jZA3a2jF3MqcxIxwgndHOIWitGlByPkFLlWfLV78+yskN9w1nWzxFvEhkCexTCcqU8TmYGBBjKU4l0icf9POdHjr9cZVQmRYdIveeEtZJS0R8S9Tx1uYEuLAXYurVEYBQXuNDw4iQV4pSabQVesX8t9KwUTkxMg2kUXIjvBcHSEiT6wtG+W/j0byNv0sj6FU2EM= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  1 04:09:47 np0005540826 python3[4567]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:09:48 np0005540826 python3[4666]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  1 04:09:48 np0005540826 python3[4737]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764580188.1232762-252-15170886185495/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=0c4295e5299f438c97cd17e88c30c039_id_rsa follow=False checksum=c0f0a3fd8bd6e06ffcd4372a522626913bfa295a backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:09:49 np0005540826 python3[4860]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  1 04:09:49 np0005540826 python3[4931]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764580188.9854465-307-52666745811701/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=0c4295e5299f438c97cd17e88c30c039_id_rsa.pub follow=False checksum=0bbaabac56f17c62b907e9f050ef8c82d5faceb9 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:09:51 np0005540826 python3[4979]: ansible-ping Invoked with data=pong
Dec  1 04:09:52 np0005540826 python3[5003]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:09:54 np0005540826 python3[5061]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Dec  1 04:09:55 np0005540826 python3[5093]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:09:55 np0005540826 python3[5117]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:09:56 np0005540826 python3[5141]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:09:56 np0005540826 python3[5165]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:09:56 np0005540826 python3[5189]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:09:57 np0005540826 python3[5213]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:09:58 np0005540826 python3[5239]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:09:59 np0005540826 python3[5317]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  1 04:10:00 np0005540826 python3[5390]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764580199.1536057-33-25907997670537/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:10:00 np0005540826 python3[5438]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  1 04:10:00 np0005540826 python3[5462]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  1 04:10:01 np0005540826 python3[5486]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  1 04:10:01 np0005540826 python3[5510]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  1 04:10:01 np0005540826 python3[5534]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  1 04:10:02 np0005540826 python3[5558]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  1 04:10:02 np0005540826 python3[5582]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  1 04:10:02 np0005540826 python3[5606]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  1 04:10:02 np0005540826 python3[5630]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  1 04:10:03 np0005540826 python3[5654]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  1 04:10:03 np0005540826 python3[5678]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  1 04:10:03 np0005540826 python3[5702]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  1 04:10:04 np0005540826 python3[5726]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  1 04:10:04 np0005540826 python3[5750]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  1 04:10:04 np0005540826 python3[5774]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  1 04:10:04 np0005540826 python3[5798]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  1 04:10:05 np0005540826 python3[5822]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  1 04:10:05 np0005540826 python3[5846]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  1 04:10:05 np0005540826 python3[5870]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  1 04:10:06 np0005540826 python3[5894]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  1 04:10:06 np0005540826 python3[5918]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  1 04:10:06 np0005540826 python3[5942]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  1 04:10:06 np0005540826 python3[5966]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  1 04:10:07 np0005540826 python3[5990]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  1 04:10:07 np0005540826 python3[6014]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  1 04:10:07 np0005540826 python3[6038]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  1 04:10:11 np0005540826 python3[6064]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Dec  1 04:10:11 np0005540826 systemd[1]: Starting Time & Date Service...
Dec  1 04:10:11 np0005540826 systemd[1]: Started Time & Date Service.
Dec  1 04:10:11 np0005540826 systemd-timedated[6066]: Changed time zone to 'UTC' (UTC).
Dec  1 04:10:11 np0005540826 python3[6095]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:10:12 np0005540826 python3[6171]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  1 04:10:12 np0005540826 python3[6242]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1764580211.8035426-252-101292409602170/source _original_basename=tmp7m2cv_fd follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:10:12 np0005540826 python3[6342]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  1 04:10:13 np0005540826 python3[6413]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1764580212.7090168-302-52724642419168/source _original_basename=tmpsukb1bjz follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:10:14 np0005540826 python3[6515]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  1 04:10:14 np0005540826 python3[6588]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1764580213.8530166-382-10215683463732/source _original_basename=tmphd3fgnkx follow=False checksum=ec1fff7a2f0c37cc5862f11a9081a375a3f4f428 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:10:15 np0005540826 python3[6636]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:10:15 np0005540826 python3[6662]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:10:15 np0005540826 python3[6742]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  1 04:10:16 np0005540826 python3[6815]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1764580215.5204477-453-114658198866247/source _original_basename=tmp0xh7ihlt follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:10:17 np0005540826 python3[6866]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163efc-24cc-bfee-2c1a-00000000001f-1-compute1 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:10:17 np0005540826 python3[6894]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env#012 _uses_shell=True zuul_log_id=fa163efc-24cc-bfee-2c1a-000000000020-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Dec  1 04:10:19 np0005540826 python3[6922]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:10:40 np0005540826 python3[6950]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:10:41 np0005540826 systemd[1]: systemd-timedated.service: Deactivated successfully.
Dec  1 04:11:40 np0005540826 systemd-logind[787]: Session 1 logged out. Waiting for processes to exit.
Dec  1 04:11:45 np0005540826 kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Dec  1 04:11:45 np0005540826 kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Dec  1 04:11:45 np0005540826 kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Dec  1 04:11:45 np0005540826 kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Dec  1 04:11:45 np0005540826 kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Dec  1 04:11:45 np0005540826 kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Dec  1 04:11:45 np0005540826 kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Dec  1 04:11:45 np0005540826 kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Dec  1 04:11:45 np0005540826 kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Dec  1 04:11:45 np0005540826 kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Dec  1 04:11:45 np0005540826 NetworkManager[860]: <info>  [1764580305.5818] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Dec  1 04:11:45 np0005540826 systemd-udevd[6957]: Network interface NamePolicy= disabled on kernel command line.
Dec  1 04:11:45 np0005540826 NetworkManager[860]: <info>  [1764580305.6021] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  1 04:11:45 np0005540826 NetworkManager[860]: <info>  [1764580305.6058] settings: (eth1): created default wired connection 'Wired connection 1'
Dec  1 04:11:45 np0005540826 NetworkManager[860]: <info>  [1764580305.6061] device (eth1): carrier: link connected
Dec  1 04:11:45 np0005540826 NetworkManager[860]: <info>  [1764580305.6063] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Dec  1 04:11:45 np0005540826 NetworkManager[860]: <info>  [1764580305.6069] policy: auto-activating connection 'Wired connection 1' (e33fe8f6-c035-3835-8b4a-b756436fe146)
Dec  1 04:11:45 np0005540826 NetworkManager[860]: <info>  [1764580305.6072] device (eth1): Activation: starting connection 'Wired connection 1' (e33fe8f6-c035-3835-8b4a-b756436fe146)
Dec  1 04:11:45 np0005540826 NetworkManager[860]: <info>  [1764580305.6073] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  1 04:11:45 np0005540826 NetworkManager[860]: <info>  [1764580305.6075] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  1 04:11:45 np0005540826 NetworkManager[860]: <info>  [1764580305.6078] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  1 04:11:45 np0005540826 NetworkManager[860]: <info>  [1764580305.6082] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Dec  1 04:11:45 np0005540826 systemd[4305]: Starting Mark boot as successful...
Dec  1 04:11:45 np0005540826 systemd[4305]: Finished Mark boot as successful.
Dec  1 04:11:46 np0005540826 systemd-logind[787]: New session 3 of user zuul.
Dec  1 04:11:46 np0005540826 systemd[1]: Started Session 3 of User zuul.
Dec  1 04:11:46 np0005540826 python3[6989]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163efc-24cc-4d84-78d9-00000000018f-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:11:56 np0005540826 python3[7069]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  1 04:11:57 np0005540826 python3[7142]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764580316.6206574-155-280886303727767/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=3afcbf522a1f6f2f7ea2230ec48018de1091830d backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:11:57 np0005540826 python3[7192]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  1 04:11:57 np0005540826 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Dec  1 04:11:57 np0005540826 systemd[1]: Stopped Network Manager Wait Online.
Dec  1 04:11:57 np0005540826 systemd[1]: Stopping Network Manager Wait Online...
Dec  1 04:11:57 np0005540826 NetworkManager[860]: <info>  [1764580317.7690] caught SIGTERM, shutting down normally.
Dec  1 04:11:57 np0005540826 systemd[1]: Stopping Network Manager...
Dec  1 04:11:57 np0005540826 NetworkManager[860]: <info>  [1764580317.7701] dhcp4 (eth0): canceled DHCP transaction
Dec  1 04:11:57 np0005540826 NetworkManager[860]: <info>  [1764580317.7701] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec  1 04:11:57 np0005540826 NetworkManager[860]: <info>  [1764580317.7702] dhcp4 (eth0): state changed no lease
Dec  1 04:11:57 np0005540826 NetworkManager[860]: <info>  [1764580317.7705] manager: NetworkManager state is now CONNECTING
Dec  1 04:11:57 np0005540826 NetworkManager[860]: <info>  [1764580317.7833] dhcp4 (eth1): canceled DHCP transaction
Dec  1 04:11:57 np0005540826 NetworkManager[860]: <info>  [1764580317.7834] dhcp4 (eth1): state changed no lease
Dec  1 04:11:57 np0005540826 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec  1 04:11:57 np0005540826 NetworkManager[860]: <info>  [1764580317.7907] exiting (success)
Dec  1 04:11:57 np0005540826 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec  1 04:11:57 np0005540826 systemd[1]: NetworkManager.service: Deactivated successfully.
Dec  1 04:11:57 np0005540826 systemd[1]: Stopped Network Manager.
Dec  1 04:11:57 np0005540826 systemd[1]: NetworkManager.service: Consumed 1.228s CPU time, 9.9M memory peak.
Dec  1 04:11:57 np0005540826 systemd[1]: Starting Network Manager...
Dec  1 04:11:57 np0005540826 NetworkManager[7204]: <info>  [1764580317.8500] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:3f13cbd7-efc0-4c84-8ecb-a4cfac3719b9)
Dec  1 04:11:57 np0005540826 NetworkManager[7204]: <info>  [1764580317.8501] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Dec  1 04:11:57 np0005540826 NetworkManager[7204]: <info>  [1764580317.8552] manager[0x558f3c3ab070]: monitoring kernel firmware directory '/lib/firmware'.
Dec  1 04:11:57 np0005540826 systemd[1]: Starting Hostname Service...
Dec  1 04:11:57 np0005540826 systemd[1]: Started Hostname Service.
Dec  1 04:11:57 np0005540826 NetworkManager[7204]: <info>  [1764580317.9546] hostname: hostname: using hostnamed
Dec  1 04:11:57 np0005540826 NetworkManager[7204]: <info>  [1764580317.9547] hostname: static hostname changed from (none) to "np0005540826.novalocal"
Dec  1 04:11:57 np0005540826 NetworkManager[7204]: <info>  [1764580317.9551] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Dec  1 04:11:57 np0005540826 NetworkManager[7204]: <info>  [1764580317.9556] manager[0x558f3c3ab070]: rfkill: Wi-Fi hardware radio set enabled
Dec  1 04:11:57 np0005540826 NetworkManager[7204]: <info>  [1764580317.9557] manager[0x558f3c3ab070]: rfkill: WWAN hardware radio set enabled
Dec  1 04:11:57 np0005540826 NetworkManager[7204]: <info>  [1764580317.9580] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Dec  1 04:11:57 np0005540826 NetworkManager[7204]: <info>  [1764580317.9580] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Dec  1 04:11:57 np0005540826 NetworkManager[7204]: <info>  [1764580317.9581] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Dec  1 04:11:57 np0005540826 NetworkManager[7204]: <info>  [1764580317.9581] manager: Networking is enabled by state file
Dec  1 04:11:57 np0005540826 NetworkManager[7204]: <info>  [1764580317.9583] settings: Loaded settings plugin: keyfile (internal)
Dec  1 04:11:57 np0005540826 NetworkManager[7204]: <info>  [1764580317.9586] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Dec  1 04:11:57 np0005540826 NetworkManager[7204]: <info>  [1764580317.9610] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Dec  1 04:11:57 np0005540826 NetworkManager[7204]: <info>  [1764580317.9618] dhcp: init: Using DHCP client 'internal'
Dec  1 04:11:57 np0005540826 NetworkManager[7204]: <info>  [1764580317.9621] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Dec  1 04:11:57 np0005540826 NetworkManager[7204]: <info>  [1764580317.9625] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  1 04:11:57 np0005540826 NetworkManager[7204]: <info>  [1764580317.9629] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Dec  1 04:11:57 np0005540826 NetworkManager[7204]: <info>  [1764580317.9635] device (lo): Activation: starting connection 'lo' (779fda9d-3aff-418f-a33a-34076793e6c3)
Dec  1 04:11:57 np0005540826 NetworkManager[7204]: <info>  [1764580317.9640] device (eth0): carrier: link connected
Dec  1 04:11:57 np0005540826 NetworkManager[7204]: <info>  [1764580317.9643] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Dec  1 04:11:57 np0005540826 NetworkManager[7204]: <info>  [1764580317.9646] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Dec  1 04:11:57 np0005540826 NetworkManager[7204]: <info>  [1764580317.9649] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Dec  1 04:11:57 np0005540826 NetworkManager[7204]: <info>  [1764580317.9653] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Dec  1 04:11:57 np0005540826 NetworkManager[7204]: <info>  [1764580317.9657] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec  1 04:11:57 np0005540826 NetworkManager[7204]: <info>  [1764580317.9664] device (eth1): carrier: link connected
Dec  1 04:11:57 np0005540826 NetworkManager[7204]: <info>  [1764580317.9667] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Dec  1 04:11:57 np0005540826 NetworkManager[7204]: <info>  [1764580317.9670] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (e33fe8f6-c035-3835-8b4a-b756436fe146) (indicated)
Dec  1 04:11:57 np0005540826 NetworkManager[7204]: <info>  [1764580317.9671] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Dec  1 04:11:57 np0005540826 NetworkManager[7204]: <info>  [1764580317.9675] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Dec  1 04:11:57 np0005540826 NetworkManager[7204]: <info>  [1764580317.9681] device (eth1): Activation: starting connection 'Wired connection 1' (e33fe8f6-c035-3835-8b4a-b756436fe146)
Dec  1 04:11:57 np0005540826 NetworkManager[7204]: <info>  [1764580317.9685] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Dec  1 04:11:57 np0005540826 systemd[1]: Started Network Manager.
Dec  1 04:11:57 np0005540826 NetworkManager[7204]: <info>  [1764580317.9688] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Dec  1 04:11:57 np0005540826 NetworkManager[7204]: <info>  [1764580317.9689] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Dec  1 04:11:57 np0005540826 NetworkManager[7204]: <info>  [1764580317.9690] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Dec  1 04:11:57 np0005540826 NetworkManager[7204]: <info>  [1764580317.9692] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Dec  1 04:11:57 np0005540826 NetworkManager[7204]: <info>  [1764580317.9693] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Dec  1 04:11:57 np0005540826 NetworkManager[7204]: <info>  [1764580317.9694] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Dec  1 04:11:57 np0005540826 NetworkManager[7204]: <info>  [1764580317.9696] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Dec  1 04:11:57 np0005540826 NetworkManager[7204]: <info>  [1764580317.9697] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Dec  1 04:11:57 np0005540826 NetworkManager[7204]: <info>  [1764580317.9702] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Dec  1 04:11:57 np0005540826 NetworkManager[7204]: <info>  [1764580317.9704] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec  1 04:11:57 np0005540826 NetworkManager[7204]: <info>  [1764580317.9710] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Dec  1 04:11:57 np0005540826 NetworkManager[7204]: <info>  [1764580317.9711] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Dec  1 04:11:57 np0005540826 NetworkManager[7204]: <info>  [1764580317.9724] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Dec  1 04:11:57 np0005540826 NetworkManager[7204]: <info>  [1764580317.9726] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Dec  1 04:11:57 np0005540826 NetworkManager[7204]: <info>  [1764580317.9731] device (lo): Activation: successful, device activated.
Dec  1 04:11:57 np0005540826 NetworkManager[7204]: <info>  [1764580317.9739] dhcp4 (eth0): state changed new lease, address=38.102.83.230
Dec  1 04:11:57 np0005540826 NetworkManager[7204]: <info>  [1764580317.9745] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Dec  1 04:11:57 np0005540826 NetworkManager[7204]: <info>  [1764580317.9796] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Dec  1 04:11:57 np0005540826 NetworkManager[7204]: <info>  [1764580317.9810] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Dec  1 04:11:57 np0005540826 NetworkManager[7204]: <info>  [1764580317.9812] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Dec  1 04:11:57 np0005540826 NetworkManager[7204]: <info>  [1764580317.9814] manager: NetworkManager state is now CONNECTED_SITE
Dec  1 04:11:57 np0005540826 NetworkManager[7204]: <info>  [1764580317.9819] device (eth0): Activation: successful, device activated.
Dec  1 04:11:57 np0005540826 NetworkManager[7204]: <info>  [1764580317.9823] manager: NetworkManager state is now CONNECTED_GLOBAL
Dec  1 04:11:57 np0005540826 systemd[1]: Starting Network Manager Wait Online...
Dec  1 04:11:58 np0005540826 python3[7277]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163efc-24cc-4d84-78d9-0000000000c8-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:12:08 np0005540826 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec  1 04:12:27 np0005540826 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec  1 04:12:43 np0005540826 NetworkManager[7204]: <info>  [1764580363.0274] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Dec  1 04:12:43 np0005540826 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec  1 04:12:43 np0005540826 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec  1 04:12:43 np0005540826 NetworkManager[7204]: <info>  [1764580363.0606] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Dec  1 04:12:43 np0005540826 NetworkManager[7204]: <info>  [1764580363.0609] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Dec  1 04:12:43 np0005540826 NetworkManager[7204]: <info>  [1764580363.0619] device (eth1): Activation: successful, device activated.
Dec  1 04:12:43 np0005540826 NetworkManager[7204]: <info>  [1764580363.0627] manager: startup complete
Dec  1 04:12:43 np0005540826 NetworkManager[7204]: <info>  [1764580363.0629] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Dec  1 04:12:43 np0005540826 NetworkManager[7204]: <warn>  [1764580363.0634] device (eth1): Activation: failed for connection 'Wired connection 1'
Dec  1 04:12:43 np0005540826 NetworkManager[7204]: <info>  [1764580363.0644] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Dec  1 04:12:43 np0005540826 systemd[1]: Finished Network Manager Wait Online.
Dec  1 04:12:43 np0005540826 NetworkManager[7204]: <info>  [1764580363.0721] dhcp4 (eth1): canceled DHCP transaction
Dec  1 04:12:43 np0005540826 NetworkManager[7204]: <info>  [1764580363.0722] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Dec  1 04:12:43 np0005540826 NetworkManager[7204]: <info>  [1764580363.0723] dhcp4 (eth1): state changed no lease
Dec  1 04:12:43 np0005540826 NetworkManager[7204]: <info>  [1764580363.0741] policy: auto-activating connection 'ci-private-network' (9e4c15b1-b624-5fdb-9211-36543aa51b8f)
Dec  1 04:12:43 np0005540826 NetworkManager[7204]: <info>  [1764580363.0745] device (eth1): Activation: starting connection 'ci-private-network' (9e4c15b1-b624-5fdb-9211-36543aa51b8f)
Dec  1 04:12:43 np0005540826 NetworkManager[7204]: <info>  [1764580363.0747] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  1 04:12:43 np0005540826 NetworkManager[7204]: <info>  [1764580363.0750] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  1 04:12:43 np0005540826 NetworkManager[7204]: <info>  [1764580363.0760] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  1 04:12:43 np0005540826 NetworkManager[7204]: <info>  [1764580363.0772] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  1 04:12:43 np0005540826 NetworkManager[7204]: <info>  [1764580363.0812] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  1 04:12:43 np0005540826 NetworkManager[7204]: <info>  [1764580363.0814] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  1 04:12:43 np0005540826 NetworkManager[7204]: <info>  [1764580363.0821] device (eth1): Activation: successful, device activated.
Dec  1 04:12:53 np0005540826 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec  1 04:12:58 np0005540826 systemd[1]: session-3.scope: Deactivated successfully.
Dec  1 04:12:58 np0005540826 systemd[1]: session-3.scope: Consumed 1.460s CPU time.
Dec  1 04:12:58 np0005540826 systemd-logind[787]: Session 3 logged out. Waiting for processes to exit.
Dec  1 04:12:58 np0005540826 systemd-logind[787]: Removed session 3.
Dec  1 04:13:34 np0005540826 systemd-logind[787]: New session 4 of user zuul.
Dec  1 04:13:34 np0005540826 systemd[1]: Started Session 4 of User zuul.
Dec  1 04:13:34 np0005540826 python3[7390]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  1 04:13:34 np0005540826 python3[7463]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764580414.1233523-373-245751750335143/source _original_basename=tmpb_2hjwht follow=False checksum=978dba8c6f7bc0ac5b14f81009c6504f60a75fb7 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:13:38 np0005540826 systemd[1]: session-4.scope: Deactivated successfully.
Dec  1 04:13:38 np0005540826 systemd-logind[787]: Session 4 logged out. Waiting for processes to exit.
Dec  1 04:13:38 np0005540826 systemd-logind[787]: Removed session 4.
Dec  1 04:15:15 np0005540826 systemd[4305]: Created slice User Background Tasks Slice.
Dec  1 04:15:15 np0005540826 systemd[4305]: Starting Cleanup of User's Temporary Files and Directories...
Dec  1 04:15:15 np0005540826 systemd[4305]: Finished Cleanup of User's Temporary Files and Directories.
Dec  1 04:19:32 np0005540826 systemd-logind[787]: New session 5 of user zuul.
Dec  1 04:19:32 np0005540826 systemd[1]: Started Session 5 of User zuul.
Dec  1 04:19:32 np0005540826 python3[7538]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda#012 _uses_shell=True zuul_log_id=fa163efc-24cc-6b80-459f-000000001cdc-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:19:33 np0005540826 python3[7567]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:19:34 np0005540826 python3[7593]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:19:34 np0005540826 python3[7619]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:19:34 np0005540826 python3[7645]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:19:35 np0005540826 python3[7671]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:19:36 np0005540826 python3[7749]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  1 04:19:36 np0005540826 python3[7822]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764580775.7447836-517-95539251347951/source _original_basename=tmpu5zsphq1 follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:19:38 np0005540826 python3[7872]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec  1 04:19:38 np0005540826 systemd[1]: Reloading.
Dec  1 04:19:38 np0005540826 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:19:39 np0005540826 python3[7928]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Dec  1 04:19:40 np0005540826 python3[7954]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:19:40 np0005540826 python3[7982]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:19:40 np0005540826 python3[8010]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:19:41 np0005540826 python3[8038]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:19:41 np0005540826 python3[8065]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;#012 _uses_shell=True zuul_log_id=fa163efc-24cc-6b80-459f-000000001ce3-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:19:42 np0005540826 python3[8095]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec  1 04:19:44 np0005540826 systemd[1]: session-5.scope: Deactivated successfully.
Dec  1 04:19:44 np0005540826 systemd[1]: session-5.scope: Consumed 4.118s CPU time.
Dec  1 04:19:44 np0005540826 systemd-logind[787]: Session 5 logged out. Waiting for processes to exit.
Dec  1 04:19:44 np0005540826 systemd-logind[787]: Removed session 5.
Dec  1 04:19:46 np0005540826 systemd-logind[787]: New session 6 of user zuul.
Dec  1 04:19:46 np0005540826 systemd[1]: Started Session 6 of User zuul.
Dec  1 04:19:46 np0005540826 python3[8128]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec  1 04:20:01 np0005540826 kernel: SELinux:  Converting 385 SID table entries...
Dec  1 04:20:01 np0005540826 kernel: SELinux:  policy capability network_peer_controls=1
Dec  1 04:20:01 np0005540826 kernel: SELinux:  policy capability open_perms=1
Dec  1 04:20:01 np0005540826 kernel: SELinux:  policy capability extended_socket_class=1
Dec  1 04:20:01 np0005540826 kernel: SELinux:  policy capability always_check_network=0
Dec  1 04:20:01 np0005540826 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec  1 04:20:01 np0005540826 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec  1 04:20:01 np0005540826 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec  1 04:20:10 np0005540826 kernel: SELinux:  Converting 385 SID table entries...
Dec  1 04:20:10 np0005540826 kernel: SELinux:  policy capability network_peer_controls=1
Dec  1 04:20:10 np0005540826 kernel: SELinux:  policy capability open_perms=1
Dec  1 04:20:10 np0005540826 kernel: SELinux:  policy capability extended_socket_class=1
Dec  1 04:20:10 np0005540826 kernel: SELinux:  policy capability always_check_network=0
Dec  1 04:20:10 np0005540826 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec  1 04:20:10 np0005540826 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec  1 04:20:10 np0005540826 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec  1 04:20:19 np0005540826 kernel: SELinux:  Converting 385 SID table entries...
Dec  1 04:20:19 np0005540826 kernel: SELinux:  policy capability network_peer_controls=1
Dec  1 04:20:19 np0005540826 kernel: SELinux:  policy capability open_perms=1
Dec  1 04:20:19 np0005540826 kernel: SELinux:  policy capability extended_socket_class=1
Dec  1 04:20:19 np0005540826 kernel: SELinux:  policy capability always_check_network=0
Dec  1 04:20:19 np0005540826 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec  1 04:20:19 np0005540826 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec  1 04:20:19 np0005540826 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec  1 04:20:21 np0005540826 setsebool[8199]: The virt_use_nfs policy boolean was changed to 1 by root
Dec  1 04:20:21 np0005540826 setsebool[8199]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Dec  1 04:20:32 np0005540826 kernel: SELinux:  Converting 388 SID table entries...
Dec  1 04:20:32 np0005540826 kernel: SELinux:  policy capability network_peer_controls=1
Dec  1 04:20:32 np0005540826 kernel: SELinux:  policy capability open_perms=1
Dec  1 04:20:32 np0005540826 kernel: SELinux:  policy capability extended_socket_class=1
Dec  1 04:20:32 np0005540826 kernel: SELinux:  policy capability always_check_network=0
Dec  1 04:20:32 np0005540826 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec  1 04:20:32 np0005540826 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec  1 04:20:32 np0005540826 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec  1 04:20:52 np0005540826 dbus-broker-launch[776]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Dec  1 04:20:52 np0005540826 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec  1 04:20:52 np0005540826 systemd[1]: Starting man-db-cache-update.service...
Dec  1 04:20:52 np0005540826 systemd[1]: Reloading.
Dec  1 04:20:52 np0005540826 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:20:52 np0005540826 systemd[1]: Queuing reload/restart jobs for marked units…
Dec  1 04:21:00 np0005540826 python3[14307]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot"#012 _uses_shell=True zuul_log_id=fa163efc-24cc-8f55-108f-00000000000c-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:21:01 np0005540826 kernel: evm: overlay not supported
Dec  1 04:21:01 np0005540826 systemd[4305]: Starting D-Bus User Message Bus...
Dec  1 04:21:01 np0005540826 dbus-broker-launch[14861]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Dec  1 04:21:01 np0005540826 dbus-broker-launch[14861]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Dec  1 04:21:01 np0005540826 systemd[4305]: Started D-Bus User Message Bus.
Dec  1 04:21:01 np0005540826 dbus-broker-lau[14861]: Ready
Dec  1 04:21:01 np0005540826 systemd[4305]: selinux: avc:  op=load_policy lsm=selinux seqno=6 res=1
Dec  1 04:21:01 np0005540826 systemd[4305]: Created slice Slice /user.
Dec  1 04:21:01 np0005540826 systemd[4305]: podman-14789.scope: unit configures an IP firewall, but not running as root.
Dec  1 04:21:01 np0005540826 systemd[4305]: (This warning is only shown for the first unit using IP firewalling.)
Dec  1 04:21:01 np0005540826 systemd[4305]: Started podman-14789.scope.
Dec  1 04:21:01 np0005540826 systemd[4305]: Started podman-pause-6dc53e6b.scope.
Dec  1 04:21:02 np0005540826 python3[15337]: ansible-ansible.builtin.blockinfile Invoked with state=present insertafter=EOF dest=/etc/containers/registries.conf content=[[registry]]#012location = "38.102.83.51:5001"#012insecure = true path=/etc/containers/registries.conf block=[[registry]]#012location = "38.102.83.51:5001"#012insecure = true marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:21:02 np0005540826 python3[15337]: ansible-ansible.builtin.blockinfile [WARNING] Module remote_tmp /root/.ansible/tmp did not exist and was created with a mode of 0700, this may cause issues when running as another user. To avoid this, create the remote_tmp dir with the correct permissions manually
Dec  1 04:21:03 np0005540826 systemd[1]: session-6.scope: Deactivated successfully.
Dec  1 04:21:03 np0005540826 systemd[1]: session-6.scope: Consumed 1min 2.627s CPU time.
Dec  1 04:21:03 np0005540826 systemd-logind[787]: Session 6 logged out. Waiting for processes to exit.
Dec  1 04:21:03 np0005540826 systemd-logind[787]: Removed session 6.
Dec  1 04:21:05 np0005540826 irqbalance[785]: Cannot change IRQ 27 affinity: Operation not permitted
Dec  1 04:21:05 np0005540826 irqbalance[785]: IRQ 27 affinity is now unmanaged
Dec  1 04:21:27 np0005540826 systemd-logind[787]: New session 7 of user zuul.
Dec  1 04:21:27 np0005540826 systemd[1]: Started Session 7 of User zuul.
Dec  1 04:21:28 np0005540826 python3[26199]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGtq5pibPyVxGWB2xMqk4uL1zofeXFQ8syXRsXPs/DtqKO/PJ2juhFzgoD/wjEUo54K4dvZgfGufGjQyIWW2pRg= zuul@np0005540824.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  1 04:21:28 np0005540826 python3[26442]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGtq5pibPyVxGWB2xMqk4uL1zofeXFQ8syXRsXPs/DtqKO/PJ2juhFzgoD/wjEUo54K4dvZgfGufGjQyIWW2pRg= zuul@np0005540824.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  1 04:21:29 np0005540826 python3[26929]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005540826.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Dec  1 04:21:30 np0005540826 python3[27265]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGtq5pibPyVxGWB2xMqk4uL1zofeXFQ8syXRsXPs/DtqKO/PJ2juhFzgoD/wjEUo54K4dvZgfGufGjQyIWW2pRg= zuul@np0005540824.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  1 04:21:30 np0005540826 python3[27572]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  1 04:21:31 np0005540826 python3[27873]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1764580890.558842-168-132466594099433/source _original_basename=tmp2jqd0fx_ follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:21:33 np0005540826 python3[28854]: ansible-ansible.builtin.hostname Invoked with name=compute-1 use=systemd
Dec  1 04:21:33 np0005540826 systemd[1]: Starting Hostname Service...
Dec  1 04:21:33 np0005540826 systemd[1]: Started Hostname Service.
Dec  1 04:21:33 np0005540826 systemd-hostnamed[28985]: Changed pretty hostname to 'compute-1'
Dec  1 04:21:33 np0005540826 systemd-hostnamed[28985]: Hostname set to <compute-1> (static)
Dec  1 04:21:33 np0005540826 NetworkManager[7204]: <info>  [1764580893.8941] hostname: static hostname changed from "np0005540826.novalocal" to "compute-1"
Dec  1 04:21:33 np0005540826 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec  1 04:21:33 np0005540826 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec  1 04:21:34 np0005540826 systemd[1]: session-7.scope: Deactivated successfully.
Dec  1 04:21:34 np0005540826 systemd[1]: session-7.scope: Consumed 2.164s CPU time.
Dec  1 04:21:34 np0005540826 systemd-logind[787]: Session 7 logged out. Waiting for processes to exit.
Dec  1 04:21:34 np0005540826 systemd-logind[787]: Removed session 7.
Dec  1 04:21:36 np0005540826 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec  1 04:21:36 np0005540826 systemd[1]: Finished man-db-cache-update.service.
Dec  1 04:21:36 np0005540826 systemd[1]: man-db-cache-update.service: Consumed 52.668s CPU time.
Dec  1 04:21:36 np0005540826 systemd[1]: run-r09e3ca6f67be4e5488a6abac137a8544.service: Deactivated successfully.
Dec  1 04:21:43 np0005540826 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec  1 04:22:03 np0005540826 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec  1 04:24:15 np0005540826 systemd[1]: Starting Cleanup of Temporary Directories...
Dec  1 04:24:15 np0005540826 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Dec  1 04:24:15 np0005540826 systemd[1]: Finished Cleanup of Temporary Directories.
Dec  1 04:24:15 np0005540826 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Dec  1 04:25:51 np0005540826 systemd-logind[787]: New session 8 of user zuul.
Dec  1 04:25:51 np0005540826 systemd[1]: Started Session 8 of User zuul.
Dec  1 04:25:51 np0005540826 python3[30102]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:25:53 np0005540826 python3[30218]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  1 04:25:54 np0005540826 python3[30291]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764581153.5886774-34001-185259775728360/source mode=0755 _original_basename=delorean.repo follow=False checksum=39c885eb875fd03e010d1b0454241c26b121dfb2 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:25:54 np0005540826 python3[30317]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-antelope-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  1 04:25:54 np0005540826 python3[30390]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764581153.5886774-34001-185259775728360/source mode=0755 _original_basename=delorean-antelope-testing.repo follow=False checksum=0bdbb813b840548359ae77c28d76ca272ccaf31b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:25:55 np0005540826 python3[30416]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  1 04:25:55 np0005540826 python3[30489]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764581153.5886774-34001-185259775728360/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:25:55 np0005540826 python3[30515]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  1 04:25:56 np0005540826 python3[30588]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764581153.5886774-34001-185259775728360/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:25:56 np0005540826 python3[30614]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  1 04:25:56 np0005540826 python3[30687]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764581153.5886774-34001-185259775728360/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:25:56 np0005540826 python3[30713]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  1 04:25:57 np0005540826 python3[30786]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764581153.5886774-34001-185259775728360/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:25:57 np0005540826 python3[30812]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  1 04:25:57 np0005540826 python3[30885]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764581153.5886774-34001-185259775728360/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=6e18e2038d54303b4926db53c0b6cced515a9151 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:26:11 np0005540826 python3[30934]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:31:10 np0005540826 systemd-logind[787]: Session 8 logged out. Waiting for processes to exit.
Dec  1 04:31:10 np0005540826 systemd[1]: session-8.scope: Deactivated successfully.
Dec  1 04:31:10 np0005540826 systemd[1]: session-8.scope: Consumed 5.072s CPU time.
Dec  1 04:31:10 np0005540826 systemd-logind[787]: Removed session 8.
Dec  1 04:38:03 np0005540826 systemd-logind[787]: New session 9 of user zuul.
Dec  1 04:38:03 np0005540826 systemd[1]: Started Session 9 of User zuul.
Dec  1 04:38:05 np0005540826 python3.9[31104]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:38:06 np0005540826 python3.9[31285]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail#012pushd /var/tmp#012curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz#012pushd repo-setup-main#012python3 -m venv ./venv#012PBR_VERSION=0.0.0 ./venv/bin/pip install ./#012./venv/bin/repo-setup current-podified -b antelope#012popd#012rm -rf repo-setup-main#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:38:14 np0005540826 systemd[1]: session-9.scope: Deactivated successfully.
Dec  1 04:38:14 np0005540826 systemd[1]: session-9.scope: Consumed 8.248s CPU time.
Dec  1 04:38:14 np0005540826 systemd-logind[787]: Session 9 logged out. Waiting for processes to exit.
Dec  1 04:38:14 np0005540826 systemd-logind[787]: Removed session 9.
Dec  1 04:38:29 np0005540826 systemd-logind[787]: New session 10 of user zuul.
Dec  1 04:38:29 np0005540826 systemd[1]: Started Session 10 of User zuul.
Dec  1 04:38:30 np0005540826 python3.9[31495]: ansible-ansible.legacy.ping Invoked with data=pong
Dec  1 04:38:31 np0005540826 python3.9[31669]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:38:32 np0005540826 python3.9[31821]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:38:33 np0005540826 python3.9[31974]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:38:34 np0005540826 python3.9[32126]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:38:35 np0005540826 python3.9[32279]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:38:36 np0005540826 python3.9[32402]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1764581915.0065947-178-110151135120588/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:38:37 np0005540826 python3.9[32554]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:38:38 np0005540826 python3.9[32710]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:38:38 np0005540826 python3.9[32862]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:38:39 np0005540826 python3.9[33012]: ansible-ansible.builtin.service_facts Invoked
Dec  1 04:38:44 np0005540826 python3.9[33265]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:38:45 np0005540826 python3.9[33415]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:38:47 np0005540826 python3.9[33569]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:38:48 np0005540826 python3.9[33727]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  1 04:38:49 np0005540826 python3.9[33811]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  1 04:39:33 np0005540826 systemd[1]: Starting dnf makecache...
Dec  1 04:39:33 np0005540826 systemd[1]: Reloading.
Dec  1 04:39:33 np0005540826 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:39:33 np0005540826 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Dec  1 04:39:33 np0005540826 dnf[33983]: Failed determining last makecache time.
Dec  1 04:39:33 np0005540826 dnf[33983]: delorean-openstack-barbican-42b4c41831408a8e323 126 kB/s | 3.0 kB     00:00
Dec  1 04:39:33 np0005540826 dnf[33983]: delorean-python-glean-10df0bd91b9bc5c9fd9cc02d7 165 kB/s | 3.0 kB     00:00
Dec  1 04:39:33 np0005540826 dnf[33983]: delorean-openstack-cinder-1c00d6490d88e436f26ef 186 kB/s | 3.0 kB     00:00
Dec  1 04:39:33 np0005540826 dnf[33983]: delorean-python-stevedore-c4acc5639fd2329372142 178 kB/s | 3.0 kB     00:00
Dec  1 04:39:33 np0005540826 dnf[33983]: delorean-python-cloudkitty-tests-tempest-2c80f8 189 kB/s | 3.0 kB     00:00
Dec  1 04:39:33 np0005540826 dnf[33983]: delorean-os-net-config-d0cedbdb788d43e5c7551df5 192 kB/s | 3.0 kB     00:00
Dec  1 04:39:33 np0005540826 dnf[33983]: delorean-openstack-nova-6f8decf0b4f1aa2e96292b6 195 kB/s | 3.0 kB     00:00
Dec  1 04:39:33 np0005540826 dnf[33983]: delorean-python-designate-tests-tempest-347fdbc 161 kB/s | 3.0 kB     00:00
Dec  1 04:39:33 np0005540826 systemd[1]: Reloading.
Dec  1 04:39:33 np0005540826 dnf[33983]: delorean-openstack-glance-1fd12c29b339f30fe823e 159 kB/s | 3.0 kB     00:00
Dec  1 04:39:33 np0005540826 dnf[33983]: delorean-openstack-keystone-e4b40af0ae3698fbbbb 146 kB/s | 3.0 kB     00:00
Dec  1 04:39:33 np0005540826 dnf[33983]: delorean-openstack-manila-3c01b7181572c95dac462 197 kB/s | 3.0 kB     00:00
Dec  1 04:39:33 np0005540826 dnf[33983]: delorean-python-whitebox-neutron-tests-tempest- 172 kB/s | 3.0 kB     00:00
Dec  1 04:39:33 np0005540826 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:39:33 np0005540826 dnf[33983]: delorean-openstack-octavia-ba397f07a7331190208c 189 kB/s | 3.0 kB     00:00
Dec  1 04:39:33 np0005540826 dnf[33983]: delorean-openstack-watcher-c014f81a8647287f6dcc 195 kB/s | 3.0 kB     00:00
Dec  1 04:39:33 np0005540826 dnf[33983]: delorean-ansible-config_template-5ccaa22121a7ff 190 kB/s | 3.0 kB     00:00
Dec  1 04:39:33 np0005540826 dnf[33983]: delorean-puppet-ceph-7352068d7b8c84ded636ab3158 193 kB/s | 3.0 kB     00:00
Dec  1 04:39:33 np0005540826 dnf[33983]: delorean-openstack-swift-dc98a8463506ac520c469a 160 kB/s | 3.0 kB     00:00
Dec  1 04:39:33 np0005540826 dnf[33983]: delorean-python-tempestconf-8515371b7cceebd4282 188 kB/s | 3.0 kB     00:00
Dec  1 04:39:33 np0005540826 dnf[33983]: delorean-openstack-heat-ui-013accbfd179753bc3f0 196 kB/s | 3.0 kB     00:00
Dec  1 04:39:33 np0005540826 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Dec  1 04:39:34 np0005540826 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Dec  1 04:39:34 np0005540826 systemd[1]: Reloading.
Dec  1 04:39:34 np0005540826 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:39:34 np0005540826 dnf[33983]: CentOS Stream 9 - BaseOS                         25 kB/s | 7.3 kB     00:00
Dec  1 04:39:34 np0005540826 systemd[1]: Listening on LVM2 poll daemon socket.
Dec  1 04:39:34 np0005540826 dnf[33983]: CentOS Stream 9 - AppStream                      62 kB/s | 7.4 kB     00:00
Dec  1 04:39:34 np0005540826 dbus-broker-launch[744]: Noticed file-system modification, trigger reload.
Dec  1 04:39:34 np0005540826 dbus-broker-launch[744]: Noticed file-system modification, trigger reload.
Dec  1 04:39:34 np0005540826 dbus-broker-launch[744]: Noticed file-system modification, trigger reload.
Dec  1 04:39:34 np0005540826 dnf[33983]: CentOS Stream 9 - CRB                            82 kB/s | 7.2 kB     00:00
Dec  1 04:39:34 np0005540826 dnf[33983]: CentOS Stream 9 - Extras packages                84 kB/s | 8.3 kB     00:00
Dec  1 04:39:34 np0005540826 dnf[33983]: dlrn-antelope-testing                           171 kB/s | 3.0 kB     00:00
Dec  1 04:39:34 np0005540826 dnf[33983]: dlrn-antelope-build-deps                        177 kB/s | 3.0 kB     00:00
Dec  1 04:39:34 np0005540826 dnf[33983]: centos9-rabbitmq                                128 kB/s | 3.0 kB     00:00
Dec  1 04:39:34 np0005540826 dnf[33983]: centos9-storage                                 128 kB/s | 3.0 kB     00:00
Dec  1 04:39:34 np0005540826 dnf[33983]: centos9-opstools                                132 kB/s | 3.0 kB     00:00
Dec  1 04:39:34 np0005540826 dnf[33983]: NFV SIG OpenvSwitch                             144 kB/s | 3.0 kB     00:00
Dec  1 04:39:34 np0005540826 dnf[33983]: repo-setup-centos-appstream                     165 kB/s | 4.4 kB     00:00
Dec  1 04:39:36 np0005540826 dnf[33983]: repo-setup-centos-baseos                        3.2 kB/s | 3.9 kB     00:01
Dec  1 04:39:36 np0005540826 dnf[33983]: repo-setup-centos-highavailability              187 kB/s | 3.9 kB     00:00
Dec  1 04:39:36 np0005540826 dnf[33983]: repo-setup-centos-powertools                    192 kB/s | 4.3 kB     00:00
Dec  1 04:39:36 np0005540826 dnf[33983]: Extra Packages for Enterprise Linux 9 - x86_64  249 kB/s |  30 kB     00:00
Dec  1 04:39:37 np0005540826 dnf[33983]: Metadata cache created.
Dec  1 04:39:37 np0005540826 systemd[1]: dnf-makecache.service: Deactivated successfully.
Dec  1 04:39:37 np0005540826 systemd[1]: Finished dnf makecache.
Dec  1 04:39:37 np0005540826 systemd[1]: dnf-makecache.service: Consumed 1.821s CPU time.
Dec  1 04:40:37 np0005540826 kernel: SELinux:  Converting 2718 SID table entries...
Dec  1 04:40:37 np0005540826 kernel: SELinux:  policy capability network_peer_controls=1
Dec  1 04:40:37 np0005540826 kernel: SELinux:  policy capability open_perms=1
Dec  1 04:40:37 np0005540826 kernel: SELinux:  policy capability extended_socket_class=1
Dec  1 04:40:37 np0005540826 kernel: SELinux:  policy capability always_check_network=0
Dec  1 04:40:37 np0005540826 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec  1 04:40:37 np0005540826 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec  1 04:40:37 np0005540826 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec  1 04:40:37 np0005540826 dbus-broker-launch[776]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Dec  1 04:40:37 np0005540826 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec  1 04:40:37 np0005540826 systemd[1]: Starting man-db-cache-update.service...
Dec  1 04:40:37 np0005540826 systemd[1]: Reloading.
Dec  1 04:40:37 np0005540826 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:40:37 np0005540826 systemd[1]: Queuing reload/restart jobs for marked units…
Dec  1 04:40:38 np0005540826 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec  1 04:40:38 np0005540826 systemd[1]: Finished man-db-cache-update.service.
Dec  1 04:40:38 np0005540826 systemd[1]: man-db-cache-update.service: Consumed 1.245s CPU time.
Dec  1 04:40:38 np0005540826 systemd[1]: run-ra94664fe0bca4d6695147d8ed2ff2a6a.service: Deactivated successfully.
Dec  1 04:40:38 np0005540826 python3.9[35376]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:40:41 np0005540826 python3.9[35657]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Dec  1 04:40:42 np0005540826 python3.9[35809]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Dec  1 04:40:44 np0005540826 python3.9[35963]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:40:45 np0005540826 python3.9[36115]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Dec  1 04:40:47 np0005540826 python3.9[36267]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:40:48 np0005540826 python3.9[36419]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:40:48 np0005540826 python3.9[36542]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764582047.571525-667-72801788392315/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=8c8748787c49c5bdccd5df153e138fac81f5459e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:40:53 np0005540826 python3.9[36694]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:40:55 np0005540826 python3.9[36846]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/vgimportdevices --all _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:40:56 np0005540826 python3.9[36999]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/lvm/devices/system.devices state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:40:57 np0005540826 python3.9[37151]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Dec  1 04:40:57 np0005540826 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  1 04:40:57 np0005540826 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  1 04:40:58 np0005540826 python3.9[37305]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec  1 04:40:59 np0005540826 python3.9[37463]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec  1 04:41:00 np0005540826 python3.9[37623]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Dec  1 04:41:01 np0005540826 python3.9[37776]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec  1 04:41:02 np0005540826 python3.9[37934]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Dec  1 04:41:03 np0005540826 python3.9[38086]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  1 04:41:06 np0005540826 python3.9[38239]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:41:06 np0005540826 python3.9[38391]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:41:07 np0005540826 python3.9[38514]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764582066.5223596-1024-56567774036704/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:41:08 np0005540826 python3.9[38666]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  1 04:41:08 np0005540826 systemd[1]: Starting Load Kernel Modules...
Dec  1 04:41:09 np0005540826 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Dec  1 04:41:09 np0005540826 kernel: Bridge firewalling registered
Dec  1 04:41:09 np0005540826 systemd-modules-load[38670]: Inserted module 'br_netfilter'
Dec  1 04:41:09 np0005540826 systemd[1]: Finished Load Kernel Modules.
Dec  1 04:41:09 np0005540826 python3.9[38825]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:41:10 np0005540826 python3.9[38948]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764582069.3251743-1094-241768348916041/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:41:11 np0005540826 python3.9[39100]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  1 04:41:15 np0005540826 dbus-broker-launch[744]: Noticed file-system modification, trigger reload.
Dec  1 04:41:15 np0005540826 dbus-broker-launch[744]: Noticed file-system modification, trigger reload.
Dec  1 04:41:15 np0005540826 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec  1 04:41:15 np0005540826 systemd[1]: Starting man-db-cache-update.service...
Dec  1 04:41:15 np0005540826 systemd[1]: Reloading.
Dec  1 04:41:15 np0005540826 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:41:15 np0005540826 systemd[1]: Queuing reload/restart jobs for marked units…
Dec  1 04:41:17 np0005540826 python3.9[40958]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:41:18 np0005540826 python3.9[42486]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Dec  1 04:41:19 np0005540826 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec  1 04:41:19 np0005540826 systemd[1]: Finished man-db-cache-update.service.
Dec  1 04:41:19 np0005540826 systemd[1]: man-db-cache-update.service: Consumed 4.828s CPU time.
Dec  1 04:41:19 np0005540826 systemd[1]: run-ra715141cc69d424b87c1a20a8a9ed9ff.service: Deactivated successfully.
Dec  1 04:41:19 np0005540826 python3.9[43108]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:41:20 np0005540826 python3.9[43260]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:41:20 np0005540826 systemd[1]: Starting Dynamic System Tuning Daemon...
Dec  1 04:41:20 np0005540826 systemd[1]: Starting Authorization Manager...
Dec  1 04:41:21 np0005540826 systemd[1]: Started Dynamic System Tuning Daemon.
Dec  1 04:41:21 np0005540826 polkitd[43477]: Started polkitd version 0.117
Dec  1 04:41:21 np0005540826 systemd[1]: Started Authorization Manager.
Dec  1 04:41:22 np0005540826 python3.9[43647]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 04:41:22 np0005540826 systemd[1]: Stopping Dynamic System Tuning Daemon...
Dec  1 04:41:22 np0005540826 systemd[1]: tuned.service: Deactivated successfully.
Dec  1 04:41:22 np0005540826 systemd[1]: Stopped Dynamic System Tuning Daemon.
Dec  1 04:41:22 np0005540826 systemd[1]: Starting Dynamic System Tuning Daemon...
Dec  1 04:41:22 np0005540826 systemd[1]: Started Dynamic System Tuning Daemon.
Dec  1 04:41:23 np0005540826 python3.9[43809]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Dec  1 04:41:27 np0005540826 python3.9[43961]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 04:41:27 np0005540826 systemd[1]: Reloading.
Dec  1 04:41:27 np0005540826 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:41:28 np0005540826 python3.9[44149]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 04:41:28 np0005540826 systemd[1]: Reloading.
Dec  1 04:41:28 np0005540826 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:41:29 np0005540826 python3.9[44338]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:41:30 np0005540826 python3.9[44491]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:41:30 np0005540826 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Dec  1 04:41:30 np0005540826 python3.9[44644]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:41:33 np0005540826 python3.9[44806]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:41:33 np0005540826 python3.9[44959]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  1 04:41:33 np0005540826 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Dec  1 04:41:33 np0005540826 systemd[1]: Stopped Apply Kernel Variables.
Dec  1 04:41:33 np0005540826 systemd[1]: Stopping Apply Kernel Variables...
Dec  1 04:41:33 np0005540826 systemd[1]: Starting Apply Kernel Variables...
Dec  1 04:41:33 np0005540826 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Dec  1 04:41:33 np0005540826 systemd[1]: Finished Apply Kernel Variables.
Dec  1 04:41:34 np0005540826 systemd[1]: session-10.scope: Deactivated successfully.
Dec  1 04:41:34 np0005540826 systemd[1]: session-10.scope: Consumed 2min 15.506s CPU time.
Dec  1 04:41:34 np0005540826 systemd-logind[787]: Session 10 logged out. Waiting for processes to exit.
Dec  1 04:41:34 np0005540826 systemd-logind[787]: Removed session 10.
Dec  1 04:41:40 np0005540826 systemd-logind[787]: New session 11 of user zuul.
Dec  1 04:41:40 np0005540826 systemd[1]: Started Session 11 of User zuul.
Dec  1 04:41:41 np0005540826 python3.9[45142]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:41:42 np0005540826 python3.9[45298]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Dec  1 04:41:43 np0005540826 python3.9[45451]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec  1 04:41:44 np0005540826 python3.9[45609]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec  1 04:41:45 np0005540826 python3.9[45769]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  1 04:41:46 np0005540826 python3.9[45853]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec  1 04:41:50 np0005540826 python3.9[46017]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  1 04:42:01 np0005540826 kernel: SELinux:  Converting 2730 SID table entries...
Dec  1 04:42:01 np0005540826 kernel: SELinux:  policy capability network_peer_controls=1
Dec  1 04:42:01 np0005540826 kernel: SELinux:  policy capability open_perms=1
Dec  1 04:42:01 np0005540826 kernel: SELinux:  policy capability extended_socket_class=1
Dec  1 04:42:01 np0005540826 kernel: SELinux:  policy capability always_check_network=0
Dec  1 04:42:01 np0005540826 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec  1 04:42:01 np0005540826 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec  1 04:42:01 np0005540826 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec  1 04:42:02 np0005540826 dbus-broker-launch[776]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Dec  1 04:42:02 np0005540826 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Dec  1 04:42:03 np0005540826 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec  1 04:42:03 np0005540826 systemd[1]: Starting man-db-cache-update.service...
Dec  1 04:42:03 np0005540826 systemd[1]: Reloading.
Dec  1 04:42:03 np0005540826 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:42:03 np0005540826 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:42:03 np0005540826 systemd[1]: Queuing reload/restart jobs for marked units…
Dec  1 04:42:04 np0005540826 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec  1 04:42:04 np0005540826 systemd[1]: Finished man-db-cache-update.service.
Dec  1 04:42:04 np0005540826 systemd[1]: man-db-cache-update.service: Consumed 1.039s CPU time.
Dec  1 04:42:04 np0005540826 systemd[1]: run-r815d156920314d4e81f40b9f35017d72.service: Deactivated successfully.
Dec  1 04:42:08 np0005540826 python3.9[47116]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec  1 04:42:08 np0005540826 systemd[1]: Reloading.
Dec  1 04:42:08 np0005540826 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:42:08 np0005540826 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:42:08 np0005540826 systemd[1]: Starting Open vSwitch Database Unit...
Dec  1 04:42:08 np0005540826 chown[47158]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Dec  1 04:42:08 np0005540826 ovs-ctl[47163]: /etc/openvswitch/conf.db does not exist ... (warning).
Dec  1 04:42:08 np0005540826 ovs-ctl[47163]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Dec  1 04:42:08 np0005540826 ovs-ctl[47163]: Starting ovsdb-server [  OK  ]
Dec  1 04:42:08 np0005540826 ovs-vsctl[47212]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Dec  1 04:42:08 np0005540826 ovs-vsctl[47228]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"b99910e3-15ec-4cc7-b887-f5229f22d165\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Dec  1 04:42:08 np0005540826 ovs-ctl[47163]: Configuring Open vSwitch system IDs [  OK  ]
Dec  1 04:42:08 np0005540826 ovs-vsctl[47237]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-1
Dec  1 04:42:09 np0005540826 ovs-ctl[47163]: Enabling remote OVSDB managers [  OK  ]
Dec  1 04:42:09 np0005540826 systemd[1]: Started Open vSwitch Database Unit.
Dec  1 04:42:09 np0005540826 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Dec  1 04:42:09 np0005540826 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Dec  1 04:42:09 np0005540826 systemd[1]: Starting Open vSwitch Forwarding Unit...
Dec  1 04:42:09 np0005540826 kernel: openvswitch: Open vSwitch switching datapath
Dec  1 04:42:09 np0005540826 ovs-ctl[47282]: Inserting openvswitch module [  OK  ]
Dec  1 04:42:09 np0005540826 ovs-ctl[47251]: Starting ovs-vswitchd [  OK  ]
Dec  1 04:42:09 np0005540826 ovs-vsctl[47300]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-1
Dec  1 04:42:09 np0005540826 ovs-ctl[47251]: Enabling remote OVSDB managers [  OK  ]
Dec  1 04:42:09 np0005540826 systemd[1]: Started Open vSwitch Forwarding Unit.
Dec  1 04:42:09 np0005540826 systemd[1]: Starting Open vSwitch...
Dec  1 04:42:09 np0005540826 systemd[1]: Finished Open vSwitch.
Dec  1 04:42:10 np0005540826 python3.9[47451]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:42:11 np0005540826 python3.9[47603]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Dec  1 04:42:12 np0005540826 kernel: SELinux:  Converting 2744 SID table entries...
Dec  1 04:42:12 np0005540826 kernel: SELinux:  policy capability network_peer_controls=1
Dec  1 04:42:12 np0005540826 kernel: SELinux:  policy capability open_perms=1
Dec  1 04:42:12 np0005540826 kernel: SELinux:  policy capability extended_socket_class=1
Dec  1 04:42:12 np0005540826 kernel: SELinux:  policy capability always_check_network=0
Dec  1 04:42:12 np0005540826 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec  1 04:42:12 np0005540826 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec  1 04:42:12 np0005540826 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec  1 04:42:13 np0005540826 python3.9[47758]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:42:14 np0005540826 dbus-broker-launch[776]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Dec  1 04:42:14 np0005540826 python3.9[47916]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  1 04:42:17 np0005540826 python3.9[48069]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:42:18 np0005540826 python3.9[48356]: ansible-ansible.builtin.file Invoked with mode=0750 path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec  1 04:42:19 np0005540826 python3.9[48506]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:42:20 np0005540826 python3.9[48660]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  1 04:42:22 np0005540826 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec  1 04:42:22 np0005540826 systemd[1]: Starting man-db-cache-update.service...
Dec  1 04:42:22 np0005540826 systemd[1]: Reloading.
Dec  1 04:42:22 np0005540826 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:42:22 np0005540826 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:42:22 np0005540826 systemd[1]: Queuing reload/restart jobs for marked units…
Dec  1 04:42:22 np0005540826 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec  1 04:42:22 np0005540826 systemd[1]: Finished man-db-cache-update.service.
Dec  1 04:42:22 np0005540826 systemd[1]: run-re0802271adb641aa825983b0904d023c.service: Deactivated successfully.
Dec  1 04:42:24 np0005540826 python3.9[48977]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  1 04:42:24 np0005540826 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Dec  1 04:42:24 np0005540826 systemd[1]: Stopped Network Manager Wait Online.
Dec  1 04:42:24 np0005540826 systemd[1]: Stopping Network Manager Wait Online...
Dec  1 04:42:24 np0005540826 systemd[1]: Stopping Network Manager...
Dec  1 04:42:24 np0005540826 NetworkManager[7204]: <info>  [1764582144.1693] caught SIGTERM, shutting down normally.
Dec  1 04:42:24 np0005540826 NetworkManager[7204]: <info>  [1764582144.1708] dhcp4 (eth0): canceled DHCP transaction
Dec  1 04:42:24 np0005540826 NetworkManager[7204]: <info>  [1764582144.1708] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec  1 04:42:24 np0005540826 NetworkManager[7204]: <info>  [1764582144.1709] dhcp4 (eth0): state changed no lease
Dec  1 04:42:24 np0005540826 NetworkManager[7204]: <info>  [1764582144.1710] manager: NetworkManager state is now CONNECTED_SITE
Dec  1 04:42:24 np0005540826 NetworkManager[7204]: <info>  [1764582144.1777] exiting (success)
Dec  1 04:42:24 np0005540826 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec  1 04:42:24 np0005540826 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec  1 04:42:24 np0005540826 systemd[1]: NetworkManager.service: Deactivated successfully.
Dec  1 04:42:24 np0005540826 systemd[1]: Stopped Network Manager.
Dec  1 04:42:24 np0005540826 systemd[1]: NetworkManager.service: Consumed 14.192s CPU time, 4.1M memory peak, read 0B from disk, written 20.0K to disk.
Dec  1 04:42:24 np0005540826 systemd[1]: Starting Network Manager...
Dec  1 04:42:24 np0005540826 NetworkManager[48989]: <info>  [1764582144.2455] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:3f13cbd7-efc0-4c84-8ecb-a4cfac3719b9)
Dec  1 04:42:24 np0005540826 NetworkManager[48989]: <info>  [1764582144.2456] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Dec  1 04:42:24 np0005540826 NetworkManager[48989]: <info>  [1764582144.2517] manager[0x55d36461b090]: monitoring kernel firmware directory '/lib/firmware'.
Dec  1 04:42:24 np0005540826 systemd[1]: Starting Hostname Service...
Dec  1 04:42:24 np0005540826 systemd[1]: Started Hostname Service.
Dec  1 04:42:24 np0005540826 NetworkManager[48989]: <info>  [1764582144.3531] hostname: hostname: using hostnamed
Dec  1 04:42:24 np0005540826 NetworkManager[48989]: <info>  [1764582144.3533] hostname: static hostname changed from (none) to "compute-1"
Dec  1 04:42:24 np0005540826 NetworkManager[48989]: <info>  [1764582144.3539] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Dec  1 04:42:24 np0005540826 NetworkManager[48989]: <info>  [1764582144.3545] manager[0x55d36461b090]: rfkill: Wi-Fi hardware radio set enabled
Dec  1 04:42:24 np0005540826 NetworkManager[48989]: <info>  [1764582144.3546] manager[0x55d36461b090]: rfkill: WWAN hardware radio set enabled
Dec  1 04:42:24 np0005540826 NetworkManager[48989]: <info>  [1764582144.3575] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-ovs.so)
Dec  1 04:42:24 np0005540826 NetworkManager[48989]: <info>  [1764582144.3588] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Dec  1 04:42:24 np0005540826 NetworkManager[48989]: <info>  [1764582144.3589] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Dec  1 04:42:24 np0005540826 NetworkManager[48989]: <info>  [1764582144.3589] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Dec  1 04:42:24 np0005540826 NetworkManager[48989]: <info>  [1764582144.3590] manager: Networking is enabled by state file
Dec  1 04:42:24 np0005540826 NetworkManager[48989]: <info>  [1764582144.3593] settings: Loaded settings plugin: keyfile (internal)
Dec  1 04:42:24 np0005540826 NetworkManager[48989]: <info>  [1764582144.3598] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Dec  1 04:42:24 np0005540826 NetworkManager[48989]: <info>  [1764582144.3624] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Dec  1 04:42:24 np0005540826 NetworkManager[48989]: <info>  [1764582144.3632] dhcp: init: Using DHCP client 'internal'
Dec  1 04:42:24 np0005540826 NetworkManager[48989]: <info>  [1764582144.3634] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Dec  1 04:42:24 np0005540826 NetworkManager[48989]: <info>  [1764582144.3638] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  1 04:42:24 np0005540826 NetworkManager[48989]: <info>  [1764582144.3643] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Dec  1 04:42:24 np0005540826 NetworkManager[48989]: <info>  [1764582144.3649] device (lo): Activation: starting connection 'lo' (779fda9d-3aff-418f-a33a-34076793e6c3)
Dec  1 04:42:24 np0005540826 NetworkManager[48989]: <info>  [1764582144.3655] device (eth0): carrier: link connected
Dec  1 04:42:24 np0005540826 NetworkManager[48989]: <info>  [1764582144.3658] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Dec  1 04:42:24 np0005540826 NetworkManager[48989]: <info>  [1764582144.3663] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Dec  1 04:42:24 np0005540826 NetworkManager[48989]: <info>  [1764582144.3663] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Dec  1 04:42:24 np0005540826 NetworkManager[48989]: <info>  [1764582144.3668] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Dec  1 04:42:24 np0005540826 NetworkManager[48989]: <info>  [1764582144.3674] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec  1 04:42:24 np0005540826 NetworkManager[48989]: <info>  [1764582144.3679] device (eth1): carrier: link connected
Dec  1 04:42:24 np0005540826 NetworkManager[48989]: <info>  [1764582144.3682] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Dec  1 04:42:24 np0005540826 NetworkManager[48989]: <info>  [1764582144.3685] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (9e4c15b1-b624-5fdb-9211-36543aa51b8f) (indicated)
Dec  1 04:42:24 np0005540826 NetworkManager[48989]: <info>  [1764582144.3686] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Dec  1 04:42:24 np0005540826 NetworkManager[48989]: <info>  [1764582144.3690] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Dec  1 04:42:24 np0005540826 NetworkManager[48989]: <info>  [1764582144.3696] device (eth1): Activation: starting connection 'ci-private-network' (9e4c15b1-b624-5fdb-9211-36543aa51b8f)
Dec  1 04:42:24 np0005540826 systemd[1]: Started Network Manager.
Dec  1 04:42:24 np0005540826 NetworkManager[48989]: <info>  [1764582144.3712] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Dec  1 04:42:24 np0005540826 NetworkManager[48989]: <info>  [1764582144.3727] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Dec  1 04:42:24 np0005540826 NetworkManager[48989]: <info>  [1764582144.3730] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Dec  1 04:42:24 np0005540826 NetworkManager[48989]: <info>  [1764582144.3731] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Dec  1 04:42:24 np0005540826 NetworkManager[48989]: <info>  [1764582144.3733] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Dec  1 04:42:24 np0005540826 NetworkManager[48989]: <info>  [1764582144.3735] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Dec  1 04:42:24 np0005540826 NetworkManager[48989]: <info>  [1764582144.3737] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Dec  1 04:42:24 np0005540826 NetworkManager[48989]: <info>  [1764582144.3738] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Dec  1 04:42:24 np0005540826 NetworkManager[48989]: <info>  [1764582144.3741] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Dec  1 04:42:24 np0005540826 NetworkManager[48989]: <info>  [1764582144.3745] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Dec  1 04:42:24 np0005540826 NetworkManager[48989]: <info>  [1764582144.3747] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec  1 04:42:24 np0005540826 NetworkManager[48989]: <info>  [1764582144.3776] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Dec  1 04:42:24 np0005540826 NetworkManager[48989]: <info>  [1764582144.3787] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Dec  1 04:42:24 np0005540826 NetworkManager[48989]: <info>  [1764582144.3792] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Dec  1 04:42:24 np0005540826 NetworkManager[48989]: <info>  [1764582144.3794] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Dec  1 04:42:24 np0005540826 NetworkManager[48989]: <info>  [1764582144.3797] device (lo): Activation: successful, device activated.
Dec  1 04:42:24 np0005540826 NetworkManager[48989]: <info>  [1764582144.3802] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Dec  1 04:42:24 np0005540826 NetworkManager[48989]: <info>  [1764582144.3804] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Dec  1 04:42:24 np0005540826 NetworkManager[48989]: <info>  [1764582144.3807] manager: NetworkManager state is now CONNECTED_LOCAL
Dec  1 04:42:24 np0005540826 NetworkManager[48989]: <info>  [1764582144.3809] device (eth1): Activation: successful, device activated.
Dec  1 04:42:24 np0005540826 NetworkManager[48989]: <info>  [1764582144.3818] dhcp4 (eth0): state changed new lease, address=38.102.83.230
Dec  1 04:42:24 np0005540826 NetworkManager[48989]: <info>  [1764582144.3823] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Dec  1 04:42:24 np0005540826 NetworkManager[48989]: <info>  [1764582144.3891] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Dec  1 04:42:24 np0005540826 NetworkManager[48989]: <info>  [1764582144.3915] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Dec  1 04:42:24 np0005540826 NetworkManager[48989]: <info>  [1764582144.3916] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Dec  1 04:42:24 np0005540826 NetworkManager[48989]: <info>  [1764582144.3918] manager: NetworkManager state is now CONNECTED_SITE
Dec  1 04:42:24 np0005540826 systemd[1]: Starting Network Manager Wait Online...
Dec  1 04:42:24 np0005540826 NetworkManager[48989]: <info>  [1764582144.3922] device (eth0): Activation: successful, device activated.
Dec  1 04:42:24 np0005540826 NetworkManager[48989]: <info>  [1764582144.3926] manager: NetworkManager state is now CONNECTED_GLOBAL
Dec  1 04:42:24 np0005540826 NetworkManager[48989]: <info>  [1764582144.3929] manager: startup complete
Dec  1 04:42:24 np0005540826 systemd[1]: Finished Network Manager Wait Online.
Dec  1 04:42:25 np0005540826 python3.9[49204]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  1 04:42:29 np0005540826 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec  1 04:42:29 np0005540826 systemd[1]: Starting man-db-cache-update.service...
Dec  1 04:42:29 np0005540826 systemd[1]: Reloading.
Dec  1 04:42:29 np0005540826 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:42:29 np0005540826 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:42:29 np0005540826 systemd[1]: Queuing reload/restart jobs for marked units…
Dec  1 04:42:30 np0005540826 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec  1 04:42:30 np0005540826 systemd[1]: Finished man-db-cache-update.service.
Dec  1 04:42:30 np0005540826 systemd[1]: run-r39b5fbe079174e7694a0e87e22eea31a.service: Deactivated successfully.
Dec  1 04:42:33 np0005540826 python3.9[49664]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:42:33 np0005540826 python3.9[49816]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:42:34 np0005540826 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec  1 04:42:34 np0005540826 python3.9[49970]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:42:35 np0005540826 python3.9[50122]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:42:36 np0005540826 python3.9[50274]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:42:36 np0005540826 python3.9[50426]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:42:37 np0005540826 python3.9[50578]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:42:38 np0005540826 python3.9[50701]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1764582156.9434507-648-136484318319846/.source _original_basename=.r307ph9x follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:42:38 np0005540826 python3.9[50853]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:42:39 np0005540826 python3.9[51005]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Dec  1 04:42:40 np0005540826 python3.9[51157]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:42:42 np0005540826 python3.9[51584]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Dec  1 04:42:44 np0005540826 ansible-async_wrapper.py[51759]: Invoked with j321333549450 300 /home/zuul/.ansible/tmp/ansible-tmp-1764582163.144568-846-192250173865611/AnsiballZ_edpm_os_net_config.py _
Dec  1 04:42:44 np0005540826 ansible-async_wrapper.py[51762]: Starting module and watcher
Dec  1 04:42:44 np0005540826 ansible-async_wrapper.py[51762]: Start watching 51763 (300)
Dec  1 04:42:44 np0005540826 ansible-async_wrapper.py[51763]: Start module (51763)
Dec  1 04:42:44 np0005540826 ansible-async_wrapper.py[51759]: Return async_wrapper task started.
Dec  1 04:42:44 np0005540826 python3.9[51764]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
Dec  1 04:42:44 np0005540826 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Dec  1 04:42:44 np0005540826 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Dec  1 04:42:44 np0005540826 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Dec  1 04:42:44 np0005540826 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Dec  1 04:42:44 np0005540826 kernel: cfg80211: failed to load regulatory.db
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.2912] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51765 uid=0 result="success"
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.2940] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51765 uid=0 result="success"
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.3874] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.3878] audit: op="connection-add" uuid="e6c1c25f-7e38-4077-8465-e0ea32ebd916" name="br-ex-br" pid=51765 uid=0 result="success"
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.3907] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.3910] audit: op="connection-add" uuid="1f7b8df1-a378-4d2f-b196-ea50a524d6b8" name="br-ex-port" pid=51765 uid=0 result="success"
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.3935] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.3937] audit: op="connection-add" uuid="1ba8141d-bc30-4e46-a417-972d9bf7113b" name="eth1-port" pid=51765 uid=0 result="success"
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.3961] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.3964] audit: op="connection-add" uuid="6c4a196a-cc24-45dc-9f6d-9dc092849453" name="vlan20-port" pid=51765 uid=0 result="success"
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.3987] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.3989] audit: op="connection-add" uuid="790575c7-47e3-41f4-8c97-efeb36918879" name="vlan21-port" pid=51765 uid=0 result="success"
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4010] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4012] audit: op="connection-add" uuid="c8f16bc3-8954-4a06-9c87-ce3ca1f0b86a" name="vlan22-port" pid=51765 uid=0 result="success"
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4033] manager: (vlan23): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/10)
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4036] audit: op="connection-add" uuid="7de54a39-e4b6-4eff-a351-7458f731981d" name="vlan23-port" pid=51765 uid=0 result="success"
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4079] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="ipv4.dhcp-client-id,ipv4.dhcp-timeout,802-3-ethernet.mtu,connection.timestamp,connection.autoconnect-priority,ipv6.dhcp-timeout,ipv6.method,ipv6.addr-gen-mode" pid=51765 uid=0 result="success"
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4113] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4115] audit: op="connection-add" uuid="62c171d3-a7da-48ce-b89b-6e6cca9859b7" name="br-ex-if" pid=51765 uid=0 result="success"
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4192] audit: op="connection-update" uuid="9e4c15b1-b624-5fdb-9211-36543aa51b8f" name="ci-private-network" args="ipv4.dns,ipv4.routes,ipv4.addresses,ipv4.method,ipv4.routing-rules,ipv4.never-default,ovs-interface.type,connection.master,connection.slave-type,connection.controller,connection.port-type,connection.timestamp,ovs-external-ids.data,ipv6.dns,ipv6.routes,ipv6.addresses,ipv6.method,ipv6.addr-gen-mode,ipv6.routing-rules" pid=51765 uid=0 result="success"
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4226] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4230] audit: op="connection-add" uuid="bbbd9539-26e1-462e-b214-f5be240039a9" name="vlan20-if" pid=51765 uid=0 result="success"
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4262] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4266] audit: op="connection-add" uuid="bcc50a09-2249-43b8-948c-b8a82a498a09" name="vlan21-if" pid=51765 uid=0 result="success"
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4298] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4301] audit: op="connection-add" uuid="5b234d4e-bd6c-4a41-b9a9-82a40bf31ca6" name="vlan22-if" pid=51765 uid=0 result="success"
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4332] manager: (vlan23): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/15)
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4335] audit: op="connection-add" uuid="ed94c6d2-e040-48a1-a394-4031a8b09f21" name="vlan23-if" pid=51765 uid=0 result="success"
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4355] audit: op="connection-delete" uuid="e33fe8f6-c035-3835-8b4a-b756436fe146" name="Wired connection 1" pid=51765 uid=0 result="success"
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4377] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4394] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4398] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (e6c1c25f-7e38-4077-8465-e0ea32ebd916)
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4399] audit: op="connection-activate" uuid="e6c1c25f-7e38-4077-8465-e0ea32ebd916" name="br-ex-br" pid=51765 uid=0 result="success"
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4401] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4411] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4416] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (1f7b8df1-a378-4d2f-b196-ea50a524d6b8)
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4419] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4425] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4429] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (1ba8141d-bc30-4e46-a417-972d9bf7113b)
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4431] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4437] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4441] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (6c4a196a-cc24-45dc-9f6d-9dc092849453)
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4443] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4451] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4455] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (790575c7-47e3-41f4-8c97-efeb36918879)
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4456] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4463] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4467] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (c8f16bc3-8954-4a06-9c87-ce3ca1f0b86a)
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4469] device (vlan23)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4477] device (vlan23)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4481] device (vlan23)[Open vSwitch Port]: Activation: starting connection 'vlan23-port' (7de54a39-e4b6-4eff-a351-7458f731981d)
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4481] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4484] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4487] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4495] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4501] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4506] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (62c171d3-a7da-48ce-b89b-6e6cca9859b7)
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4507] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4509] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4511] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4512] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4513] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4524] device (eth1): disconnecting for new activation request.
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4525] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4528] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4530] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4531] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4533] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4539] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4544] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (bbbd9539-26e1-462e-b214-f5be240039a9)
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4545] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4549] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4551] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4553] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4556] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4562] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4567] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (bcc50a09-2249-43b8-948c-b8a82a498a09)
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4568] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4571] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4573] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4575] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4578] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4584] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4588] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (5b234d4e-bd6c-4a41-b9a9-82a40bf31ca6)
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4589] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4592] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4594] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4595] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4598] device (vlan23)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4603] device (vlan23)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4607] device (vlan23)[Open vSwitch Interface]: Activation: starting connection 'vlan23-if' (ed94c6d2-e040-48a1-a394-4031a8b09f21)
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4608] device (vlan23)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4611] device (vlan23)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4613] device (vlan23)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4615] device (vlan23)[Open vSwitch Port]: Activation: connection 'vlan23-port' attached as port, continuing activation
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4617] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4633] audit: op="device-reapply" interface="eth0" ifindex=2 args="ipv4.dhcp-client-id,ipv4.dhcp-timeout,802-3-ethernet.mtu,connection.autoconnect-priority,ipv6.method,ipv6.addr-gen-mode" pid=51765 uid=0 result="success"
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4634] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4637] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4639] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4646] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4650] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4654] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4657] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4659] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4664] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4668] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4671] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4673] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4678] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540826 kernel: ovs-system: entered promiscuous mode
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4682] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4686] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4688] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4694] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4698] device (vlan23)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4702] device (vlan23)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4704] device (vlan23)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4710] device (vlan23)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4715] dhcp4 (eth0): canceled DHCP transaction
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4715] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4715] dhcp4 (eth0): state changed no lease
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4717] dhcp4 (eth0): activation: beginning transaction (no timeout)
Dec  1 04:42:46 np0005540826 systemd-udevd[51771]: Network interface NamePolicy= disabled on kernel command line.
Dec  1 04:42:46 np0005540826 kernel: Timeout policy base is empty
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4731] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4737] audit: op="device-reapply" interface="eth1" ifindex=3 pid=51765 uid=0 result="fail" reason="Device is not activated"
Dec  1 04:42:46 np0005540826 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4783] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4789] dhcp4 (eth0): state changed new lease, address=38.102.83.230
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4794] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4839] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4852] device (eth1): disconnecting for new activation request.
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4853] audit: op="connection-activate" uuid="9e4c15b1-b624-5fdb-9211-36543aa51b8f" name="ci-private-network" pid=51765 uid=0 result="success"
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4855] device (vlan23)[Open vSwitch Interface]: Activation: connection 'vlan23-if' attached as port, continuing activation
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4904] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51765 uid=0 result="success"
Dec  1 04:42:46 np0005540826 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.4928] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.5103] device (eth1): Activation: starting connection 'ci-private-network' (9e4c15b1-b624-5fdb-9211-36543aa51b8f)
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.5110] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.5123] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.5129] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.5138] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540826 kernel: br-ex: entered promiscuous mode
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.5145] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.5153] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.5155] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.5157] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.5159] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.5161] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.5163] device (vlan23)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.5176] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.5184] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.5190] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.5195] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.5201] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.5206] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.5212] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.5217] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.5223] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.5228] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.5234] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.5238] device (vlan23)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.5243] device (vlan23)[Open vSwitch Port]: Activation: successful, device activated.
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.5253] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.5259] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540826 kernel: vlan22: entered promiscuous mode
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.5332] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.5339] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.5346] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.5355] device (eth1): Activation: successful, device activated.
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.5375] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540826 kernel: vlan23: entered promiscuous mode
Dec  1 04:42:46 np0005540826 systemd-udevd[51769]: Network interface NamePolicy= disabled on kernel command line.
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.5402] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.5404] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.5414] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.5465] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Dec  1 04:42:46 np0005540826 kernel: vlan21: entered promiscuous mode
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.5501] device (vlan23)[Open vSwitch Interface]: carrier: link connected
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.5505] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.5525] device (vlan23)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.5531] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.5536] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.5543] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.5553] device (vlan23)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.5555] device (vlan23)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.5562] device (vlan23)[Open vSwitch Interface]: Activation: successful, device activated.
Dec  1 04:42:46 np0005540826 kernel: vlan20: entered promiscuous mode
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.5616] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.5633] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540826 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.5671] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.5673] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.5678] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.5715] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.5724] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.5743] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.5744] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540826 NetworkManager[48989]: <info>  [1764582166.5750] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Dec  1 04:42:47 np0005540826 NetworkManager[48989]: <info>  [1764582167.7018] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51765 uid=0 result="success"
Dec  1 04:42:47 np0005540826 python3.9[52122]: ansible-ansible.legacy.async_status Invoked with jid=j321333549450.51759 mode=status _async_dir=/root/.ansible_async
Dec  1 04:42:47 np0005540826 NetworkManager[48989]: <info>  [1764582167.8412] checkpoint[0x55d3645f0950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Dec  1 04:42:47 np0005540826 NetworkManager[48989]: <info>  [1764582167.8414] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51765 uid=0 result="success"
Dec  1 04:42:48 np0005540826 NetworkManager[48989]: <info>  [1764582168.1491] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51765 uid=0 result="success"
Dec  1 04:42:48 np0005540826 NetworkManager[48989]: <info>  [1764582168.1504] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51765 uid=0 result="success"
Dec  1 04:42:48 np0005540826 NetworkManager[48989]: <info>  [1764582168.3693] audit: op="networking-control" arg="global-dns-configuration" pid=51765 uid=0 result="success"
Dec  1 04:42:48 np0005540826 NetworkManager[48989]: <info>  [1764582168.3722] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Dec  1 04:42:48 np0005540826 NetworkManager[48989]: <info>  [1764582168.3748] audit: op="networking-control" arg="global-dns-configuration" pid=51765 uid=0 result="success"
Dec  1 04:42:48 np0005540826 NetworkManager[48989]: <info>  [1764582168.3772] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51765 uid=0 result="success"
Dec  1 04:42:48 np0005540826 NetworkManager[48989]: <info>  [1764582168.5489] checkpoint[0x55d3645f0a20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Dec  1 04:42:48 np0005540826 NetworkManager[48989]: <info>  [1764582168.5493] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51765 uid=0 result="success"
Dec  1 04:42:48 np0005540826 ansible-async_wrapper.py[51763]: Module complete (51763)
Dec  1 04:42:49 np0005540826 ansible-async_wrapper.py[51762]: Done in kid B.
Dec  1 04:42:51 np0005540826 python3.9[52228]: ansible-ansible.legacy.async_status Invoked with jid=j321333549450.51759 mode=status _async_dir=/root/.ansible_async
Dec  1 04:42:51 np0005540826 python3.9[52328]: ansible-ansible.legacy.async_status Invoked with jid=j321333549450.51759 mode=cleanup _async_dir=/root/.ansible_async
Dec  1 04:42:52 np0005540826 python3.9[52480]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:42:53 np0005540826 python3.9[52603]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764582172.1978288-927-2004849429764/.source.returncode _original_basename=.hqsfib43 follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:42:54 np0005540826 python3.9[52755]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:42:54 np0005540826 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec  1 04:42:54 np0005540826 python3.9[52882]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764582173.5515876-975-116032456200893/.source.cfg _original_basename=.247vb4cd follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:42:55 np0005540826 python3.9[53034]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  1 04:42:55 np0005540826 systemd[1]: Reloading Network Manager...
Dec  1 04:42:55 np0005540826 NetworkManager[48989]: <info>  [1764582175.8519] audit: op="reload" arg="0" pid=53038 uid=0 result="success"
Dec  1 04:42:55 np0005540826 NetworkManager[48989]: <info>  [1764582175.8526] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Dec  1 04:42:55 np0005540826 systemd[1]: Reloaded Network Manager.
Dec  1 04:42:56 np0005540826 systemd[1]: session-11.scope: Deactivated successfully.
Dec  1 04:42:56 np0005540826 systemd[1]: session-11.scope: Consumed 50.155s CPU time.
Dec  1 04:42:56 np0005540826 systemd-logind[787]: Session 11 logged out. Waiting for processes to exit.
Dec  1 04:42:56 np0005540826 systemd-logind[787]: Removed session 11.
Dec  1 04:43:01 np0005540826 systemd-logind[787]: New session 12 of user zuul.
Dec  1 04:43:01 np0005540826 systemd[1]: Started Session 12 of User zuul.
Dec  1 04:43:02 np0005540826 python3.9[53222]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:43:03 np0005540826 python3.9[53376]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  1 04:43:05 np0005540826 python3.9[53570]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:43:05 np0005540826 systemd[1]: session-12.scope: Deactivated successfully.
Dec  1 04:43:05 np0005540826 systemd[1]: session-12.scope: Consumed 2.535s CPU time.
Dec  1 04:43:05 np0005540826 systemd-logind[787]: Session 12 logged out. Waiting for processes to exit.
Dec  1 04:43:05 np0005540826 systemd-logind[787]: Removed session 12.
Dec  1 04:43:05 np0005540826 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec  1 04:43:10 np0005540826 systemd-logind[787]: New session 13 of user zuul.
Dec  1 04:43:10 np0005540826 systemd[1]: Started Session 13 of User zuul.
Dec  1 04:43:11 np0005540826 python3.9[53752]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:43:12 np0005540826 python3.9[53906]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:43:14 np0005540826 python3.9[54063]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  1 04:43:14 np0005540826 python3.9[54147]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  1 04:43:16 np0005540826 python3.9[54300]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  1 04:43:18 np0005540826 python3.9[54496]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:43:19 np0005540826 python3.9[54648]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:43:19 np0005540826 podman[54649]: 2025-12-01 09:43:19.357423104 +0000 UTC m=+0.051795133 system refresh
Dec  1 04:43:20 np0005540826 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  1 04:43:20 np0005540826 python3.9[54810]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:43:21 np0005540826 python3.9[54933]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764582199.8133473-198-67793983793482/.source.json follow=False _original_basename=podman_network_config.j2 checksum=a7d10cc62754e58d4e3f9793fa755de4e4589330 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:43:21 np0005540826 python3.9[55085]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:43:22 np0005540826 python3.9[55208]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764582201.3945963-243-81068422820919/.source.conf follow=False _original_basename=registries.conf.j2 checksum=a92d4bce7d9cad3a31d9a297b9e21f629ee446cd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:43:23 np0005540826 python3.9[55360]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:43:24 np0005540826 python3.9[55512]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:43:24 np0005540826 python3.9[55664]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:43:25 np0005540826 python3.9[55816]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:43:26 np0005540826 python3.9[55968]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  1 04:43:29 np0005540826 python3.9[56121]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:43:29 np0005540826 python3.9[56275]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:43:30 np0005540826 python3.9[56427]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:43:31 np0005540826 python3.9[56579]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:43:32 np0005540826 python3.9[56732]: ansible-service_facts Invoked
Dec  1 04:43:32 np0005540826 network[56749]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec  1 04:43:32 np0005540826 network[56750]: 'network-scripts' will be removed from distribution in near future.
Dec  1 04:43:32 np0005540826 network[56751]: It is advised to switch to 'NetworkManager' instead for network management.
Dec  1 04:43:38 np0005540826 python3.9[57203]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  1 04:43:42 np0005540826 python3.9[57356]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Dec  1 04:43:43 np0005540826 python3.9[57508]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:43:44 np0005540826 python3.9[57633]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764582223.1859918-675-245468958870910/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:43:45 np0005540826 python3.9[57787]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:43:45 np0005540826 python3.9[57912]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764582224.735169-721-239102414514305/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:43:47 np0005540826 python3.9[58066]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:43:49 np0005540826 python3.9[58220]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  1 04:43:50 np0005540826 python3.9[58304]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 04:43:52 np0005540826 python3.9[58458]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  1 04:43:53 np0005540826 python3.9[58542]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  1 04:43:53 np0005540826 chronyd[790]: chronyd exiting
Dec  1 04:43:53 np0005540826 systemd[1]: Stopping NTP client/server...
Dec  1 04:43:53 np0005540826 systemd[1]: chronyd.service: Deactivated successfully.
Dec  1 04:43:53 np0005540826 systemd[1]: Stopped NTP client/server.
Dec  1 04:43:53 np0005540826 systemd[1]: Starting NTP client/server...
Dec  1 04:43:53 np0005540826 chronyd[58550]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Dec  1 04:43:53 np0005540826 chronyd[58550]: Frequency -25.342 +/- 0.168 ppm read from /var/lib/chrony/drift
Dec  1 04:43:53 np0005540826 chronyd[58550]: Loaded seccomp filter (level 2)
Dec  1 04:43:53 np0005540826 systemd[1]: Started NTP client/server.
Dec  1 04:43:53 np0005540826 systemd[1]: session-13.scope: Deactivated successfully.
Dec  1 04:43:53 np0005540826 systemd[1]: session-13.scope: Consumed 25.966s CPU time.
Dec  1 04:43:54 np0005540826 systemd-logind[787]: Session 13 logged out. Waiting for processes to exit.
Dec  1 04:43:54 np0005540826 systemd-logind[787]: Removed session 13.
Dec  1 04:44:04 np0005540826 systemd-logind[787]: New session 14 of user zuul.
Dec  1 04:44:04 np0005540826 systemd[1]: Started Session 14 of User zuul.
Dec  1 04:44:05 np0005540826 python3.9[58731]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:44:06 np0005540826 python3.9[58883]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/ceph-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:44:07 np0005540826 python3.9[59006]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/ceph-networks.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764582246.127696-63-246386534110329/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=729ea8396013e3343245d6e934e0dcef55029ad2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:44:07 np0005540826 systemd[1]: session-14.scope: Deactivated successfully.
Dec  1 04:44:07 np0005540826 systemd[1]: session-14.scope: Consumed 1.672s CPU time.
Dec  1 04:44:07 np0005540826 systemd-logind[787]: Session 14 logged out. Waiting for processes to exit.
Dec  1 04:44:07 np0005540826 systemd-logind[787]: Removed session 14.
Dec  1 04:44:13 np0005540826 systemd-logind[787]: New session 15 of user zuul.
Dec  1 04:44:13 np0005540826 systemd[1]: Started Session 15 of User zuul.
Dec  1 04:44:14 np0005540826 python3.9[59184]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:44:15 np0005540826 python3.9[59340]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:44:16 np0005540826 python3.9[59515]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:44:17 np0005540826 python3.9[59638]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1764582256.219368-84-245059813705690/.source.json _original_basename=.5mnx8kw7 follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:44:18 np0005540826 python3.9[59790]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:44:19 np0005540826 python3.9[59913]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764582258.0584185-153-232773149969641/.source _original_basename=.513xb31v follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:44:19 np0005540826 python3.9[60065]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:44:20 np0005540826 python3.9[60217]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:44:21 np0005540826 python3.9[60340]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764582260.222642-225-92817865479267/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:44:21 np0005540826 python3.9[60492]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:44:22 np0005540826 python3.9[60615]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764582261.328314-225-274605573372624/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:44:23 np0005540826 python3.9[60767]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:44:23 np0005540826 python3.9[60919]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:44:24 np0005540826 python3.9[61042]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764582263.477943-336-242778801273992/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:44:25 np0005540826 python3.9[61194]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:44:26 np0005540826 python3.9[61317]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764582265.3213809-381-174113304233419/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:44:27 np0005540826 python3.9[61469]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 04:44:27 np0005540826 systemd[1]: Reloading.
Dec  1 04:44:27 np0005540826 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:44:27 np0005540826 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:44:27 np0005540826 systemd[1]: Reloading.
Dec  1 04:44:27 np0005540826 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:44:27 np0005540826 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:44:28 np0005540826 systemd[1]: Starting EDPM Container Shutdown...
Dec  1 04:44:28 np0005540826 systemd[1]: Finished EDPM Container Shutdown.
Dec  1 04:44:28 np0005540826 python3.9[61698]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:44:29 np0005540826 python3.9[61821]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764582268.3448722-450-191834036623009/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:44:30 np0005540826 python3.9[61973]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:44:30 np0005540826 python3.9[62096]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764582269.63454-495-159562369493059/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:44:31 np0005540826 python3.9[62248]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 04:44:31 np0005540826 systemd[1]: Reloading.
Dec  1 04:44:31 np0005540826 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:44:31 np0005540826 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:44:31 np0005540826 systemd[1]: Reloading.
Dec  1 04:44:31 np0005540826 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:44:31 np0005540826 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:44:32 np0005540826 systemd[1]: Starting Create netns directory...
Dec  1 04:44:32 np0005540826 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec  1 04:44:32 np0005540826 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec  1 04:44:32 np0005540826 systemd[1]: Finished Create netns directory.
Dec  1 04:44:32 np0005540826 python3.9[62475]: ansible-ansible.builtin.service_facts Invoked
Dec  1 04:44:32 np0005540826 network[62492]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec  1 04:44:33 np0005540826 network[62493]: 'network-scripts' will be removed from distribution in near future.
Dec  1 04:44:33 np0005540826 network[62494]: It is advised to switch to 'NetworkManager' instead for network management.
Dec  1 04:44:37 np0005540826 python3.9[62756]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 04:44:37 np0005540826 systemd[1]: Reloading.
Dec  1 04:44:37 np0005540826 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:44:37 np0005540826 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:44:37 np0005540826 systemd[1]: Stopping IPv4 firewall with iptables...
Dec  1 04:44:37 np0005540826 iptables.init[62797]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Dec  1 04:44:38 np0005540826 iptables.init[62797]: iptables: Flushing firewall rules: [  OK  ]
Dec  1 04:44:38 np0005540826 systemd[1]: iptables.service: Deactivated successfully.
Dec  1 04:44:38 np0005540826 systemd[1]: Stopped IPv4 firewall with iptables.
Dec  1 04:44:38 np0005540826 python3.9[62993]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 04:44:39 np0005540826 python3.9[63147]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 04:44:39 np0005540826 systemd[1]: Reloading.
Dec  1 04:44:40 np0005540826 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:44:40 np0005540826 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:44:40 np0005540826 systemd[1]: Starting Netfilter Tables...
Dec  1 04:44:40 np0005540826 systemd[1]: Finished Netfilter Tables.
Dec  1 04:44:41 np0005540826 python3.9[63340]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:44:42 np0005540826 python3.9[63493]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:44:43 np0005540826 python3.9[63618]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764582281.8422127-702-225173938440733/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:44:44 np0005540826 python3.9[63771]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  1 04:44:44 np0005540826 systemd[1]: Reloading OpenSSH server daemon...
Dec  1 04:44:44 np0005540826 systemd[1]: Reloaded OpenSSH server daemon.
Dec  1 04:44:45 np0005540826 python3.9[63927]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:44:45 np0005540826 python3.9[64079]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:44:46 np0005540826 python3.9[64202]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764582285.280285-795-42813057927344/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:44:47 np0005540826 python3.9[64354]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Dec  1 04:44:47 np0005540826 systemd[1]: Starting Time & Date Service...
Dec  1 04:44:47 np0005540826 systemd[1]: Started Time & Date Service.
Dec  1 04:44:48 np0005540826 python3.9[64510]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:44:49 np0005540826 python3.9[64662]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:44:49 np0005540826 python3.9[64785]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764582288.8142982-900-218112040249848/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:44:50 np0005540826 python3.9[64937]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:44:51 np0005540826 python3.9[65060]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764582290.2186456-945-228926693464414/.source.yaml _original_basename=.4x9w3f51 follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:44:52 np0005540826 python3.9[65212]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:44:52 np0005540826 python3.9[65335]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764582291.7670195-990-170405153454142/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:44:53 np0005540826 python3.9[65487]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:44:54 np0005540826 python3.9[65640]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:44:55 np0005540826 python3[65793]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec  1 04:44:55 np0005540826 python3.9[65945]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:44:56 np0005540826 python3.9[66068]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764582295.368256-1107-198979594885512/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:44:57 np0005540826 python3.9[66220]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:44:57 np0005540826 python3.9[66343]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764582296.8333566-1152-106601933823281/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:44:58 np0005540826 python3.9[66495]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:44:59 np0005540826 python3.9[66618]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764582298.1582785-1197-185177235838669/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:44:59 np0005540826 python3.9[66770]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:45:00 np0005540826 python3.9[66893]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764582299.4895887-1242-272968101925339/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:45:01 np0005540826 python3.9[67045]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:45:01 np0005540826 python3.9[67168]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764582300.780813-1287-234837467744889/.source.nft follow=False _original_basename=ruleset.j2 checksum=693377dc03e5b6b24713cb537b18b88774724e35 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:45:02 np0005540826 python3.9[67320]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:45:03 np0005540826 python3.9[67472]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:45:04 np0005540826 python3.9[67631]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:45:05 np0005540826 python3.9[67784]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:45:05 np0005540826 python3.9[67936]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:45:06 np0005540826 python3.9[68088]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Dec  1 04:45:07 np0005540826 python3.9[68241]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Dec  1 04:45:07 np0005540826 systemd[1]: session-15.scope: Deactivated successfully.
Dec  1 04:45:07 np0005540826 systemd[1]: session-15.scope: Consumed 36.227s CPU time.
Dec  1 04:45:07 np0005540826 systemd-logind[787]: Session 15 logged out. Waiting for processes to exit.
Dec  1 04:45:07 np0005540826 systemd-logind[787]: Removed session 15.
Dec  1 04:45:13 np0005540826 systemd-logind[787]: New session 16 of user zuul.
Dec  1 04:45:13 np0005540826 systemd[1]: Started Session 16 of User zuul.
Dec  1 04:45:13 np0005540826 python3.9[68422]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Dec  1 04:45:14 np0005540826 python3.9[68574]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:45:15 np0005540826 python3.9[68726]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:45:16 np0005540826 python3.9[68878]: ansible-ansible.builtin.blockinfile Invoked with block=compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC+tytlc2ziEXCaePFL6NCHfQfG5hnoDOgK+/O6WujzT2GFJESz6sgXypOXA+ry9uSM1AFkZgIIj7YfrFvtxYbWsEyzbhXKiOr8noIZGkfc+43imB+C2FgUp5ZwQSFnnxyIiXQWwKIjrOXbXE1r5SClA+FIAojDoectq/AbKwehIzD1ayHdfehF7BTfXJbkf64RgNcctGyjz0LPxY2mXC0kQXEFZSqJIOn5sys9wQEkjd4XlXA66oaJPV948m4ApJniNd9ohIVmXKAO5Bo6D4WQVvrA03w7PurWjJmpQuKNNwzAn2MMUfwfF0FiH9nxKa5/yEHRA/jTlNtqA/xOFC1uvGvgfWLDMfh+AtXxrNJXtp+qeATiUthHFK9ZRT6xaqkdd+LzySkLVyUCxpvEeOSKcHCqoxNBMZ5p9skmKbus5DRvzBSzPSGfBqh+7efuwSYYRveVZ2iqukef+cMJ5t+mlGuIAZulVVeLXhivpqH20o4d+WgBLNWpPZtP1w3vnds=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKDMbjmqVhbMiFxfeq71aiHzezH5+ve9aaRv6tecZ9yt#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBD2a9/UKab06QjpszdfyP/8+Fmx0ghbxasoTU/24//g4p6oYwAMEXLcqU8YkQj66SK/B/CRmkko20tQpuvcB+LQ=#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC9iOYT2GM4L6SHZTMq11oZ+BAk/eXQ8XBJJYa2Eo/9VKQiuDMNzjXWKc1heeqMgloaJAk+En3hPDTZcnt14xKW0weSVhc1GuXBU3IqdQGeO3nyjdhUNxj2O6Syt/8Srh0+ne/yimC9BxBrCHKmwPPCx0TTtiy3n953HP5w0wedM8MI2bl9X4CaVwEtwSUbhFJgRaAVvg1jWUBV+tE9CGQXy1Y7raeATTLvRa3PIqU2pSDvvN44SuFWubkATb9CNZfejG2Tz2N709KveFa1tPaAjiuj046dUN+nb5eMroLvf2T2MoSQ12AUXHcpxVB6qb918qUpn8x9/V65c4fkXQ3nNgbF3IHP7RcwSs0XISdGLMT1NPTmYDhECjFDqTwkiK+goHUXZY3N3dYfjS9uqS1/66OIDlWK6niL0DMO6j+L/iriIIzPVWmrEz384bDc+wVQgGjmVXolCOWq/vp6TE1nAFqsNTZmQXC8BHCGtitnnWgzgbJX3D4O4dBOqHqdPr8=#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGEIBRopLb4IdSGL1f5PVbv9932FzGHz/9YCDTQr6PvA#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDEJ0q084PIbFOMDxHa25lnKuVffDClzijZagkDx2W3Z17XxuTVNXMnebqlksv3x5cE8TQLF/PIAPJS87wX+Nuo=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDNxuYL62ECxG4tKU506Q3pIBb6yt0LTfxUgzUGORrXbIq9WrYwVeb+Lkx8v046r7H1KM8BsXHHuc+/3UYA3ldToNXUkjnpV43woAUm6zBViUE4+fgkcOJmVpRTZ/uXPMGTCGECUFZ9zuo3AFkcF0ERCcieOSdVs4uPytJLM0anMY2JZ9BHHzwlK3u+R7I452i/2bTjizB5yGGjV/5usLKdzn3gANHxbNcnVh+sI8fLZDldSAoeh+Lmihzsfp+4optdWgF0GnEgV3ui8NyR+nrPN2A09+4jC0EKzW3P8PT6CaTEgt95tkEYJ0/ihBlX210GmX32GEZfnHIOSflIiIeeAz/8vomjGlRwArfsmlOxT56Q9rekK5hD2orlFCjOvrzfoJN7vvTaE/P8ls/6015TUzbkS2WqhMLJbIvNcumWshvtYifwfnwMI2BK7YTHKpx1Qc/3anJqszHfO0G7ar3+3DemlY50qxApCrKUlE/w1rQtiN1VKmlioP2XpCmwe1s=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKm9ziDthsQekJ2ppuyoRsJLe7WplMYSfdzI6Ftkcb9s#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAnzEG8a/rCCjdE5RU3Uk/1EHo5xwDY20eWwn6aeXJMS7blUnv3gyCa8WoIefjhilEbylrojzG4Tmv2ZgeeLQd4=#012 create=True mode=0644 path=/tmp/ansible.bgigik1n state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:45:17 np0005540826 python3.9[69030]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.bgigik1n' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:45:17 np0005540826 systemd[1]: systemd-timedated.service: Deactivated successfully.
Dec  1 04:45:18 np0005540826 python3.9[69186]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.bgigik1n state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:45:19 np0005540826 systemd[1]: session-16.scope: Deactivated successfully.
Dec  1 04:45:19 np0005540826 systemd[1]: session-16.scope: Consumed 3.683s CPU time.
Dec  1 04:45:19 np0005540826 systemd-logind[787]: Session 16 logged out. Waiting for processes to exit.
Dec  1 04:45:19 np0005540826 systemd-logind[787]: Removed session 16.
Dec  1 04:45:25 np0005540826 systemd-logind[787]: New session 17 of user zuul.
Dec  1 04:45:25 np0005540826 systemd[1]: Started Session 17 of User zuul.
Dec  1 04:45:26 np0005540826 python3.9[69364]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:45:27 np0005540826 python3.9[69520]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Dec  1 04:45:29 np0005540826 python3.9[69674]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  1 04:45:30 np0005540826 python3.9[69827]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:45:31 np0005540826 python3.9[69980]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:45:32 np0005540826 python3.9[70134]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:45:32 np0005540826 python3.9[70289]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:45:33 np0005540826 systemd[1]: session-17.scope: Deactivated successfully.
Dec  1 04:45:33 np0005540826 systemd[1]: session-17.scope: Consumed 4.763s CPU time.
Dec  1 04:45:33 np0005540826 systemd-logind[787]: Session 17 logged out. Waiting for processes to exit.
Dec  1 04:45:33 np0005540826 systemd-logind[787]: Removed session 17.
Dec  1 04:45:38 np0005540826 systemd-logind[787]: New session 18 of user zuul.
Dec  1 04:45:38 np0005540826 systemd[1]: Started Session 18 of User zuul.
Dec  1 04:45:39 np0005540826 python3.9[70467]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:45:40 np0005540826 python3.9[70623]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  1 04:45:41 np0005540826 python3.9[70707]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec  1 04:45:43 np0005540826 python3.9[70858]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:45:44 np0005540826 python3.9[71009]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec  1 04:45:45 np0005540826 python3.9[71159]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:45:45 np0005540826 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  1 04:45:46 np0005540826 python3.9[71310]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/config follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:45:47 np0005540826 systemd[1]: session-18.scope: Deactivated successfully.
Dec  1 04:45:47 np0005540826 systemd[1]: session-18.scope: Consumed 6.016s CPU time.
Dec  1 04:45:47 np0005540826 systemd-logind[787]: Session 18 logged out. Waiting for processes to exit.
Dec  1 04:45:47 np0005540826 systemd-logind[787]: Removed session 18.
Dec  1 04:45:55 np0005540826 systemd-logind[787]: New session 19 of user zuul.
Dec  1 04:45:55 np0005540826 systemd[1]: Started Session 19 of User zuul.
Dec  1 04:46:01 np0005540826 python3[72077]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:46:03 np0005540826 chronyd[58550]: Selected source 162.159.200.1 (pool.ntp.org)
Dec  1 04:46:03 np0005540826 python3[72172]: ansible-ansible.legacy.dnf Invoked with name=['util-linux', 'lvm2', 'jq', 'podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec  1 04:46:05 np0005540826 python3[72199]: ansible-ansible.builtin.stat Invoked with path=/dev/loop3 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec  1 04:46:05 np0005540826 python3[72225]: ansible-ansible.legacy.command Invoked with _raw_params=dd if=/dev/zero of=/var/lib/ceph-osd-0.img bs=1 count=0 seek=20G#012losetup /dev/loop3 /var/lib/ceph-osd-0.img#012lsblk _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:46:05 np0005540826 kernel: loop: module loaded
Dec  1 04:46:05 np0005540826 kernel: loop3: detected capacity change from 0 to 41943040
Dec  1 04:46:06 np0005540826 python3[72260]: ansible-ansible.legacy.command Invoked with _raw_params=pvcreate /dev/loop3#012vgcreate ceph_vg0 /dev/loop3#012lvcreate -n ceph_lv0 -l +100%FREE ceph_vg0#012lvs _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:46:06 np0005540826 lvm[72263]: PV /dev/loop3 not used.
Dec  1 04:46:06 np0005540826 lvm[72272]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec  1 04:46:06 np0005540826 systemd[1]: Started /usr/sbin/lvm vgchange -aay --autoactivation event ceph_vg0.
Dec  1 04:46:06 np0005540826 lvm[72274]:  1 logical volume(s) in volume group "ceph_vg0" now active
Dec  1 04:46:06 np0005540826 systemd[1]: lvm-activate-ceph_vg0.service: Deactivated successfully.
Dec  1 04:46:06 np0005540826 python3[72352]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/ceph-osd-losetup-0.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  1 04:46:07 np0005540826 python3[72425]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764582366.6756127-36891-5689294243601/source dest=/etc/systemd/system/ceph-osd-losetup-0.service mode=0644 force=True follow=False _original_basename=ceph-osd-losetup.service.j2 checksum=427b1db064a970126b729b07acf99fa7d0eecb9c backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:46:08 np0005540826 python3[72475]: ansible-ansible.builtin.systemd Invoked with state=started enabled=True name=ceph-osd-losetup-0.service daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 04:46:08 np0005540826 systemd[1]: Reloading.
Dec  1 04:46:08 np0005540826 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:46:08 np0005540826 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:46:08 np0005540826 systemd[1]: Starting Ceph OSD losetup...
Dec  1 04:46:08 np0005540826 bash[72515]: /dev/loop3: [64513]:4327940 (/var/lib/ceph-osd-0.img)
Dec  1 04:46:08 np0005540826 systemd[1]: Finished Ceph OSD losetup.
Dec  1 04:46:08 np0005540826 lvm[72516]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec  1 04:46:08 np0005540826 lvm[72516]: VG ceph_vg0 finished
Dec  1 04:46:11 np0005540826 python3[72540]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:47:40 np0005540826 systemd-logind[787]: New session 20 of user ceph-admin.
Dec  1 04:47:40 np0005540826 systemd[1]: Created slice User Slice of UID 42477.
Dec  1 04:47:40 np0005540826 systemd[1]: Starting User Runtime Directory /run/user/42477...
Dec  1 04:47:40 np0005540826 systemd[1]: Finished User Runtime Directory /run/user/42477.
Dec  1 04:47:40 np0005540826 systemd[1]: Starting User Manager for UID 42477...
Dec  1 04:47:40 np0005540826 systemd-logind[787]: New session 22 of user ceph-admin.
Dec  1 04:47:40 np0005540826 systemd[72588]: Queued start job for default target Main User Target.
Dec  1 04:47:40 np0005540826 systemd[72588]: Created slice User Application Slice.
Dec  1 04:47:40 np0005540826 systemd[72588]: Started Mark boot as successful after the user session has run 2 minutes.
Dec  1 04:47:40 np0005540826 systemd[72588]: Started Daily Cleanup of User's Temporary Directories.
Dec  1 04:47:40 np0005540826 systemd[72588]: Reached target Paths.
Dec  1 04:47:40 np0005540826 systemd[72588]: Reached target Timers.
Dec  1 04:47:40 np0005540826 systemd[72588]: Starting D-Bus User Message Bus Socket...
Dec  1 04:47:40 np0005540826 systemd[72588]: Starting Create User's Volatile Files and Directories...
Dec  1 04:47:40 np0005540826 systemd[72588]: Listening on D-Bus User Message Bus Socket.
Dec  1 04:47:40 np0005540826 systemd[72588]: Reached target Sockets.
Dec  1 04:47:40 np0005540826 systemd[72588]: Finished Create User's Volatile Files and Directories.
Dec  1 04:47:40 np0005540826 systemd[72588]: Reached target Basic System.
Dec  1 04:47:40 np0005540826 systemd[72588]: Reached target Main User Target.
Dec  1 04:47:40 np0005540826 systemd[72588]: Startup finished in 157ms.
Dec  1 04:47:40 np0005540826 systemd[1]: Started User Manager for UID 42477.
Dec  1 04:47:40 np0005540826 systemd[1]: Started Session 20 of User ceph-admin.
Dec  1 04:47:40 np0005540826 systemd[1]: Started Session 22 of User ceph-admin.
Dec  1 04:47:40 np0005540826 systemd-logind[787]: New session 23 of user ceph-admin.
Dec  1 04:47:40 np0005540826 systemd[1]: Started Session 23 of User ceph-admin.
Dec  1 04:47:41 np0005540826 systemd-logind[787]: New session 24 of user ceph-admin.
Dec  1 04:47:41 np0005540826 systemd[1]: Started Session 24 of User ceph-admin.
Dec  1 04:47:41 np0005540826 systemd-logind[787]: New session 25 of user ceph-admin.
Dec  1 04:47:41 np0005540826 systemd[1]: Started Session 25 of User ceph-admin.
Dec  1 04:47:41 np0005540826 systemd-logind[787]: New session 26 of user ceph-admin.
Dec  1 04:47:41 np0005540826 systemd[1]: Started Session 26 of User ceph-admin.
Dec  1 04:47:42 np0005540826 systemd-logind[787]: New session 27 of user ceph-admin.
Dec  1 04:47:42 np0005540826 systemd[1]: Started Session 27 of User ceph-admin.
Dec  1 04:47:42 np0005540826 systemd-logind[787]: New session 28 of user ceph-admin.
Dec  1 04:47:42 np0005540826 systemd[1]: Started Session 28 of User ceph-admin.
Dec  1 04:47:42 np0005540826 systemd-logind[787]: New session 29 of user ceph-admin.
Dec  1 04:47:43 np0005540826 systemd[1]: Started Session 29 of User ceph-admin.
Dec  1 04:47:43 np0005540826 systemd-logind[787]: New session 30 of user ceph-admin.
Dec  1 04:47:43 np0005540826 systemd[1]: Started Session 30 of User ceph-admin.
Dec  1 04:47:44 np0005540826 systemd-logind[787]: New session 31 of user ceph-admin.
Dec  1 04:47:44 np0005540826 systemd[1]: Started Session 31 of User ceph-admin.
Dec  1 04:47:44 np0005540826 systemd-logind[787]: New session 32 of user ceph-admin.
Dec  1 04:47:44 np0005540826 systemd[1]: Started Session 32 of User ceph-admin.
Dec  1 04:47:45 np0005540826 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  1 04:47:45 np0005540826 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  1 04:47:46 np0005540826 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  1 04:47:46 np0005540826 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  1 04:47:46 np0005540826 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 73161 (sysctl)
Dec  1 04:47:46 np0005540826 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Dec  1 04:47:46 np0005540826 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Dec  1 04:47:47 np0005540826 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  1 04:47:47 np0005540826 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  1 04:47:50 np0005540826 systemd[1]: var-lib-containers-storage-overlay-compat4091523786-lower\x2dmapped.mount: Deactivated successfully.
Dec  1 04:48:06 np0005540826 podman[73339]: 2025-12-01 09:48:06.169312846 +0000 UTC m=+18.234844021 container create 967e21cb7fe2c5ce9ce1c5579afeb05ed199852bba89cd61cc4c122ba6424aa1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=thirsty_shannon, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:48:06 np0005540826 systemd[1]: var-lib-containers-storage-overlay-volatile\x2dcheck3502170512-merged.mount: Deactivated successfully.
Dec  1 04:48:06 np0005540826 podman[73339]: 2025-12-01 09:48:06.154990423 +0000 UTC m=+18.220521628 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 04:48:06 np0005540826 systemd[1]: Created slice Virtual Machine and Container Slice.
Dec  1 04:48:06 np0005540826 systemd[1]: Started libpod-conmon-967e21cb7fe2c5ce9ce1c5579afeb05ed199852bba89cd61cc4c122ba6424aa1.scope.
Dec  1 04:48:06 np0005540826 systemd[1]: Started libcrun container.
Dec  1 04:48:06 np0005540826 podman[73339]: 2025-12-01 09:48:06.250289218 +0000 UTC m=+18.315820473 container init 967e21cb7fe2c5ce9ce1c5579afeb05ed199852bba89cd61cc4c122ba6424aa1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=thirsty_shannon, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, ceph=True, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:48:06 np0005540826 podman[73339]: 2025-12-01 09:48:06.256604028 +0000 UTC m=+18.322135213 container start 967e21cb7fe2c5ce9ce1c5579afeb05ed199852bba89cd61cc4c122ba6424aa1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=thirsty_shannon, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.license=GPLv2, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:48:06 np0005540826 podman[73339]: 2025-12-01 09:48:06.259923342 +0000 UTC m=+18.325454587 container attach 967e21cb7fe2c5ce9ce1c5579afeb05ed199852bba89cd61cc4c122ba6424aa1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=thirsty_shannon, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True)
Dec  1 04:48:06 np0005540826 thirsty_shannon[73419]: 167 167
Dec  1 04:48:06 np0005540826 systemd[1]: libpod-967e21cb7fe2c5ce9ce1c5579afeb05ed199852bba89cd61cc4c122ba6424aa1.scope: Deactivated successfully.
Dec  1 04:48:06 np0005540826 podman[73339]: 2025-12-01 09:48:06.265626087 +0000 UTC m=+18.331157302 container died 967e21cb7fe2c5ce9ce1c5579afeb05ed199852bba89cd61cc4c122ba6424aa1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=thirsty_shannon, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, ceph=True, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec  1 04:48:06 np0005540826 systemd[1]: var-lib-containers-storage-overlay-c1a438c44a7c5338cdbb832654e2f6b81355910b19e6ccb9f5f1cc8e7030ef49-merged.mount: Deactivated successfully.
Dec  1 04:48:06 np0005540826 podman[73339]: 2025-12-01 09:48:06.311855618 +0000 UTC m=+18.377386813 container remove 967e21cb7fe2c5ce9ce1c5579afeb05ed199852bba89cd61cc4c122ba6424aa1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=thirsty_shannon, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_REF=squid)
Dec  1 04:48:06 np0005540826 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  1 04:48:06 np0005540826 systemd[1]: libpod-conmon-967e21cb7fe2c5ce9ce1c5579afeb05ed199852bba89cd61cc4c122ba6424aa1.scope: Deactivated successfully.
Dec  1 04:48:06 np0005540826 podman[73443]: 2025-12-01 09:48:06.488626698 +0000 UTC m=+0.043155565 container create dae1757cef34f21d00e5d3693fb09d786b6c6ae8fea3ddd8d60c4f6b5f9c0a5f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=tender_chaum, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.license=GPLv2)
Dec  1 04:48:06 np0005540826 systemd[1]: Started libpod-conmon-dae1757cef34f21d00e5d3693fb09d786b6c6ae8fea3ddd8d60c4f6b5f9c0a5f.scope.
Dec  1 04:48:06 np0005540826 systemd[1]: Started libcrun container.
Dec  1 04:48:06 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d12d96431b8561a95849b26583b90a3f201b7224f7283d17a54615f699a3820c/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:48:06 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d12d96431b8561a95849b26583b90a3f201b7224f7283d17a54615f699a3820c/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:48:06 np0005540826 podman[73443]: 2025-12-01 09:48:06.468407195 +0000 UTC m=+0.022936082 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 04:48:06 np0005540826 podman[73443]: 2025-12-01 09:48:06.569437616 +0000 UTC m=+0.123966473 container init dae1757cef34f21d00e5d3693fb09d786b6c6ae8fea3ddd8d60c4f6b5f9c0a5f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=tender_chaum, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.license=GPLv2)
Dec  1 04:48:06 np0005540826 podman[73443]: 2025-12-01 09:48:06.576838433 +0000 UTC m=+0.131367290 container start dae1757cef34f21d00e5d3693fb09d786b6c6ae8fea3ddd8d60c4f6b5f9c0a5f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=tender_chaum, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.build-date=20250325)
Dec  1 04:48:06 np0005540826 podman[73443]: 2025-12-01 09:48:06.580416034 +0000 UTC m=+0.134944901 container attach dae1757cef34f21d00e5d3693fb09d786b6c6ae8fea3ddd8d60c4f6b5f9c0a5f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=tender_chaum, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, CEPH_REF=squid)
Dec  1 04:48:07 np0005540826 tender_chaum[73459]: [
Dec  1 04:48:07 np0005540826 tender_chaum[73459]:    {
Dec  1 04:48:07 np0005540826 tender_chaum[73459]:        "available": false,
Dec  1 04:48:07 np0005540826 tender_chaum[73459]:        "being_replaced": false,
Dec  1 04:48:07 np0005540826 tender_chaum[73459]:        "ceph_device_lvm": false,
Dec  1 04:48:07 np0005540826 tender_chaum[73459]:        "device_id": "QEMU_DVD-ROM_QM00001",
Dec  1 04:48:07 np0005540826 tender_chaum[73459]:        "lsm_data": {},
Dec  1 04:48:07 np0005540826 tender_chaum[73459]:        "lvs": [],
Dec  1 04:48:07 np0005540826 tender_chaum[73459]:        "path": "/dev/sr0",
Dec  1 04:48:07 np0005540826 tender_chaum[73459]:        "rejected_reasons": [
Dec  1 04:48:07 np0005540826 tender_chaum[73459]:            "Insufficient space (<5GB)",
Dec  1 04:48:07 np0005540826 tender_chaum[73459]:            "Has a FileSystem"
Dec  1 04:48:07 np0005540826 tender_chaum[73459]:        ],
Dec  1 04:48:07 np0005540826 tender_chaum[73459]:        "sys_api": {
Dec  1 04:48:07 np0005540826 tender_chaum[73459]:            "actuators": null,
Dec  1 04:48:07 np0005540826 tender_chaum[73459]:            "device_nodes": [
Dec  1 04:48:07 np0005540826 tender_chaum[73459]:                "sr0"
Dec  1 04:48:07 np0005540826 tender_chaum[73459]:            ],
Dec  1 04:48:07 np0005540826 tender_chaum[73459]:            "devname": "sr0",
Dec  1 04:48:07 np0005540826 tender_chaum[73459]:            "human_readable_size": "482.00 KB",
Dec  1 04:48:07 np0005540826 tender_chaum[73459]:            "id_bus": "ata",
Dec  1 04:48:07 np0005540826 tender_chaum[73459]:            "model": "QEMU DVD-ROM",
Dec  1 04:48:07 np0005540826 tender_chaum[73459]:            "nr_requests": "2",
Dec  1 04:48:07 np0005540826 tender_chaum[73459]:            "parent": "/dev/sr0",
Dec  1 04:48:07 np0005540826 tender_chaum[73459]:            "partitions": {},
Dec  1 04:48:07 np0005540826 tender_chaum[73459]:            "path": "/dev/sr0",
Dec  1 04:48:07 np0005540826 tender_chaum[73459]:            "removable": "1",
Dec  1 04:48:07 np0005540826 tender_chaum[73459]:            "rev": "2.5+",
Dec  1 04:48:07 np0005540826 tender_chaum[73459]:            "ro": "0",
Dec  1 04:48:07 np0005540826 tender_chaum[73459]:            "rotational": "1",
Dec  1 04:48:07 np0005540826 tender_chaum[73459]:            "sas_address": "",
Dec  1 04:48:07 np0005540826 tender_chaum[73459]:            "sas_device_handle": "",
Dec  1 04:48:07 np0005540826 tender_chaum[73459]:            "scheduler_mode": "mq-deadline",
Dec  1 04:48:07 np0005540826 tender_chaum[73459]:            "sectors": 0,
Dec  1 04:48:07 np0005540826 tender_chaum[73459]:            "sectorsize": "2048",
Dec  1 04:48:07 np0005540826 tender_chaum[73459]:            "size": 493568.0,
Dec  1 04:48:07 np0005540826 tender_chaum[73459]:            "support_discard": "2048",
Dec  1 04:48:07 np0005540826 tender_chaum[73459]:            "type": "disk",
Dec  1 04:48:07 np0005540826 tender_chaum[73459]:            "vendor": "QEMU"
Dec  1 04:48:07 np0005540826 tender_chaum[73459]:        }
Dec  1 04:48:07 np0005540826 tender_chaum[73459]:    }
Dec  1 04:48:07 np0005540826 tender_chaum[73459]: ]
Dec  1 04:48:07 np0005540826 systemd[1]: libpod-dae1757cef34f21d00e5d3693fb09d786b6c6ae8fea3ddd8d60c4f6b5f9c0a5f.scope: Deactivated successfully.
Dec  1 04:48:07 np0005540826 podman[73443]: 2025-12-01 09:48:07.253942941 +0000 UTC m=+0.808471848 container died dae1757cef34f21d00e5d3693fb09d786b6c6ae8fea3ddd8d60c4f6b5f9c0a5f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=tender_chaum, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, CEPH_REF=squid, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1)
Dec  1 04:48:07 np0005540826 systemd[1]: var-lib-containers-storage-overlay-d12d96431b8561a95849b26583b90a3f201b7224f7283d17a54615f699a3820c-merged.mount: Deactivated successfully.
Dec  1 04:48:07 np0005540826 podman[73443]: 2025-12-01 09:48:07.301621279 +0000 UTC m=+0.856150136 container remove dae1757cef34f21d00e5d3693fb09d786b6c6ae8fea3ddd8d60c4f6b5f9c0a5f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=tender_chaum, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:48:07 np0005540826 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  1 04:48:07 np0005540826 systemd[1]: libpod-conmon-dae1757cef34f21d00e5d3693fb09d786b6c6ae8fea3ddd8d60c4f6b5f9c0a5f.scope: Deactivated successfully.
Dec  1 04:48:10 np0005540826 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  1 04:48:10 np0005540826 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  1 04:48:10 np0005540826 podman[75326]: 2025-12-01 09:48:10.626261605 +0000 UTC m=+0.044908629 container create 6f334a60fddc8a11c2a084ce909fd1a6b3c2159c2bfd7c04acc7737de5431a4d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zealous_heyrovsky, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, ceph=True)
Dec  1 04:48:10 np0005540826 systemd[1]: Started libpod-conmon-6f334a60fddc8a11c2a084ce909fd1a6b3c2159c2bfd7c04acc7737de5431a4d.scope.
Dec  1 04:48:10 np0005540826 systemd[1]: Started libcrun container.
Dec  1 04:48:10 np0005540826 podman[75326]: 2025-12-01 09:48:10.606014872 +0000 UTC m=+0.024661936 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 04:48:10 np0005540826 podman[75326]: 2025-12-01 09:48:10.709886814 +0000 UTC m=+0.128533848 container init 6f334a60fddc8a11c2a084ce909fd1a6b3c2159c2bfd7c04acc7737de5431a4d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zealous_heyrovsky, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.build-date=20250325, CEPH_REF=squid, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0)
Dec  1 04:48:10 np0005540826 podman[75326]: 2025-12-01 09:48:10.716381518 +0000 UTC m=+0.135028522 container start 6f334a60fddc8a11c2a084ce909fd1a6b3c2159c2bfd7c04acc7737de5431a4d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zealous_heyrovsky, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:48:10 np0005540826 podman[75326]: 2025-12-01 09:48:10.719994 +0000 UTC m=+0.138641024 container attach 6f334a60fddc8a11c2a084ce909fd1a6b3c2159c2bfd7c04acc7737de5431a4d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zealous_heyrovsky, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, ceph=True, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:48:10 np0005540826 zealous_heyrovsky[75342]: 167 167
Dec  1 04:48:10 np0005540826 systemd[1]: libpod-6f334a60fddc8a11c2a084ce909fd1a6b3c2159c2bfd7c04acc7737de5431a4d.scope: Deactivated successfully.
Dec  1 04:48:10 np0005540826 podman[75326]: 2025-12-01 09:48:10.722414601 +0000 UTC m=+0.141061625 container died 6f334a60fddc8a11c2a084ce909fd1a6b3c2159c2bfd7c04acc7737de5431a4d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zealous_heyrovsky, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, ceph=True, CEPH_REF=squid, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:48:10 np0005540826 podman[75326]: 2025-12-01 09:48:10.75512845 +0000 UTC m=+0.173775454 container remove 6f334a60fddc8a11c2a084ce909fd1a6b3c2159c2bfd7c04acc7737de5431a4d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zealous_heyrovsky, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:48:10 np0005540826 systemd[1]: libpod-conmon-6f334a60fddc8a11c2a084ce909fd1a6b3c2159c2bfd7c04acc7737de5431a4d.scope: Deactivated successfully.
Dec  1 04:48:10 np0005540826 systemd[1]: Reloading.
Dec  1 04:48:10 np0005540826 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:48:10 np0005540826 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:48:10 np0005540826 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  1 04:48:11 np0005540826 systemd[1]: Reloading.
Dec  1 04:48:11 np0005540826 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:48:11 np0005540826 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:48:11 np0005540826 systemd[1]: Reached target All Ceph clusters and services.
Dec  1 04:48:11 np0005540826 systemd[1]: Reloading.
Dec  1 04:48:11 np0005540826 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:48:11 np0005540826 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:48:11 np0005540826 systemd[1]: Reached target Ceph cluster 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 04:48:11 np0005540826 systemd[1]: Reloading.
Dec  1 04:48:11 np0005540826 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:48:11 np0005540826 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:48:11 np0005540826 systemd[1]: Reloading.
Dec  1 04:48:11 np0005540826 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:48:11 np0005540826 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:48:12 np0005540826 systemd[1]: Created slice Slice /system/ceph-365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 04:48:12 np0005540826 systemd[1]: Reached target System Time Set.
Dec  1 04:48:12 np0005540826 systemd[1]: Reached target System Time Synchronized.
Dec  1 04:48:12 np0005540826 systemd[1]: Starting Ceph crash.compute-1 for 365f19c2-81e5-5edd-b6b4-280555214d3a...
Dec  1 04:48:12 np0005540826 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  1 04:48:12 np0005540826 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  1 04:48:12 np0005540826 podman[75595]: 2025-12-01 09:48:12.332215342 +0000 UTC m=+0.071922193 container create 7c618b1a07db29948a07bf15abaac0e58a16d635b553771e228aad7ce3c0be89 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-crash-compute-1, ceph=True, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.build-date=20250325, OSD_FLAVOR=default)
Dec  1 04:48:12 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/19caa5ed6d7c9da2ee8084a1c7b7962e103434332d7b2cb5268a1b4b53800a4d/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:48:12 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/19caa5ed6d7c9da2ee8084a1c7b7962e103434332d7b2cb5268a1b4b53800a4d/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:48:12 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/19caa5ed6d7c9da2ee8084a1c7b7962e103434332d7b2cb5268a1b4b53800a4d/merged/etc/ceph/ceph.client.crash.compute-1.keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 04:48:12 np0005540826 podman[75595]: 2025-12-01 09:48:12.287276234 +0000 UTC m=+0.026983105 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 04:48:12 np0005540826 podman[75595]: 2025-12-01 09:48:12.386339264 +0000 UTC m=+0.126046135 container init 7c618b1a07db29948a07bf15abaac0e58a16d635b553771e228aad7ce3c0be89 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-crash-compute-1, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.schema-version=1.0)
Dec  1 04:48:12 np0005540826 podman[75595]: 2025-12-01 09:48:12.391011832 +0000 UTC m=+0.130718683 container start 7c618b1a07db29948a07bf15abaac0e58a16d635b553771e228aad7ce3c0be89 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-crash-compute-1, OSD_FLAVOR=default, io.buildah.version=1.40.1, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:48:12 np0005540826 bash[75595]: 7c618b1a07db29948a07bf15abaac0e58a16d635b553771e228aad7ce3c0be89
Dec  1 04:48:12 np0005540826 systemd[1]: Started Ceph crash.compute-1 for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 04:48:12 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-crash-compute-1[75610]: INFO:ceph-crash:pinging cluster to exercise our key
Dec  1 04:48:12 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-crash-compute-1[75610]: 2025-12-01T09:48:12.556+0000 7efd3e6f3640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory
Dec  1 04:48:12 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-crash-compute-1[75610]: 2025-12-01T09:48:12.556+0000 7efd3e6f3640 -1 AuthRegistry(0x7efd38069490) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx
Dec  1 04:48:12 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-crash-compute-1[75610]: 2025-12-01T09:48:12.557+0000 7efd3e6f3640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory
Dec  1 04:48:12 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-crash-compute-1[75610]: 2025-12-01T09:48:12.557+0000 7efd3e6f3640 -1 AuthRegistry(0x7efd3e6f1ff0) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx
Dec  1 04:48:12 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-crash-compute-1[75610]: 2025-12-01T09:48:12.559+0000 7efd37fff640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Dec  1 04:48:12 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-crash-compute-1[75610]: 2025-12-01T09:48:12.560+0000 7efd3e6f3640 -1 monclient: authenticate NOTE: no keyring found; disabled cephx authentication
Dec  1 04:48:12 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-crash-compute-1[75610]: [errno 13] RADOS permission denied (error connecting to the cluster)
Dec  1 04:48:12 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-crash-compute-1[75610]: INFO:ceph-crash:monitoring path /var/lib/ceph/crash, delay 600s
Dec  1 04:48:13 np0005540826 podman[75717]: 2025-12-01 09:48:13.097290689 +0000 UTC m=+0.035128661 container create bdd32fb0f2d69cbd2143f8ff3640c0f6f75dadc31ab217a83081927064ee45c2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=serene_cannon, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default)
Dec  1 04:48:13 np0005540826 systemd[1]: Started libpod-conmon-bdd32fb0f2d69cbd2143f8ff3640c0f6f75dadc31ab217a83081927064ee45c2.scope.
Dec  1 04:48:13 np0005540826 systemd[1]: Started libcrun container.
Dec  1 04:48:13 np0005540826 podman[75717]: 2025-12-01 09:48:13.0819206 +0000 UTC m=+0.019758592 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 04:48:13 np0005540826 podman[75717]: 2025-12-01 09:48:13.226257518 +0000 UTC m=+0.164095550 container init bdd32fb0f2d69cbd2143f8ff3640c0f6f75dadc31ab217a83081927064ee45c2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=serene_cannon, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, ceph=True, CEPH_REF=squid)
Dec  1 04:48:13 np0005540826 podman[75717]: 2025-12-01 09:48:13.233507671 +0000 UTC m=+0.171345643 container start bdd32fb0f2d69cbd2143f8ff3640c0f6f75dadc31ab217a83081927064ee45c2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=serene_cannon, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325)
Dec  1 04:48:13 np0005540826 serene_cannon[75734]: 167 167
Dec  1 04:48:13 np0005540826 systemd[1]: libpod-bdd32fb0f2d69cbd2143f8ff3640c0f6f75dadc31ab217a83081927064ee45c2.scope: Deactivated successfully.
Dec  1 04:48:13 np0005540826 podman[75717]: 2025-12-01 09:48:13.335585253 +0000 UTC m=+0.273423275 container attach bdd32fb0f2d69cbd2143f8ff3640c0f6f75dadc31ab217a83081927064ee45c2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=serene_cannon, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, ceph=True, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:48:13 np0005540826 podman[75717]: 2025-12-01 09:48:13.336549964 +0000 UTC m=+0.274387946 container died bdd32fb0f2d69cbd2143f8ff3640c0f6f75dadc31ab217a83081927064ee45c2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=serene_cannon, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:48:13 np0005540826 systemd[1]: var-lib-containers-storage-overlay-7b620d85f5ac69e56df57265ea5aee7c0353c99440843a8020c4ee67fe44767a-merged.mount: Deactivated successfully.
Dec  1 04:48:13 np0005540826 podman[75717]: 2025-12-01 09:48:13.37883944 +0000 UTC m=+0.316677452 container remove bdd32fb0f2d69cbd2143f8ff3640c0f6f75dadc31ab217a83081927064ee45c2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=serene_cannon, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_REF=squid)
Dec  1 04:48:13 np0005540826 systemd[1]: libpod-conmon-bdd32fb0f2d69cbd2143f8ff3640c0f6f75dadc31ab217a83081927064ee45c2.scope: Deactivated successfully.
Dec  1 04:48:13 np0005540826 podman[75760]: 2025-12-01 09:48:13.585398986 +0000 UTC m=+0.058197057 container create 47418faffd383d7e2a9cf48eebfaf4f7107446864a1f1c86b14186805ece9669 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=frosty_beaver, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, ceph=True, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:48:13 np0005540826 systemd[1]: Started libpod-conmon-47418faffd383d7e2a9cf48eebfaf4f7107446864a1f1c86b14186805ece9669.scope.
Dec  1 04:48:13 np0005540826 podman[75760]: 2025-12-01 09:48:13.559378131 +0000 UTC m=+0.032176202 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 04:48:13 np0005540826 systemd[1]: Started libcrun container.
Dec  1 04:48:13 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aa85dfd5e5713fcb041b70d182d720922753d5edbe2ba048237e4090808c8ff3/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:48:13 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aa85dfd5e5713fcb041b70d182d720922753d5edbe2ba048237e4090808c8ff3/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:48:13 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aa85dfd5e5713fcb041b70d182d720922753d5edbe2ba048237e4090808c8ff3/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:48:13 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aa85dfd5e5713fcb041b70d182d720922753d5edbe2ba048237e4090808c8ff3/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:48:13 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aa85dfd5e5713fcb041b70d182d720922753d5edbe2ba048237e4090808c8ff3/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 04:48:13 np0005540826 podman[75760]: 2025-12-01 09:48:13.704420592 +0000 UTC m=+0.177218703 container init 47418faffd383d7e2a9cf48eebfaf4f7107446864a1f1c86b14186805ece9669 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=frosty_beaver, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2)
Dec  1 04:48:13 np0005540826 podman[75760]: 2025-12-01 09:48:13.719117765 +0000 UTC m=+0.191915826 container start 47418faffd383d7e2a9cf48eebfaf4f7107446864a1f1c86b14186805ece9669 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=frosty_beaver, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:48:13 np0005540826 podman[75760]: 2025-12-01 09:48:13.723916937 +0000 UTC m=+0.196715048 container attach 47418faffd383d7e2a9cf48eebfaf4f7107446864a1f1c86b14186805ece9669 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=frosty_beaver, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, io.buildah.version=1.40.1, ceph=True, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec  1 04:48:14 np0005540826 frosty_beaver[75776]: --> passed data devices: 0 physical, 1 LVM
Dec  1 04:48:14 np0005540826 frosty_beaver[75776]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec  1 04:48:14 np0005540826 frosty_beaver[75776]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec  1 04:48:14 np0005540826 frosty_beaver[75776]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring -i - osd new a81d93fb-5215-4a2c-87f7-124573e3e396
Dec  1 04:48:14 np0005540826 frosty_beaver[75776]: Running command: /usr/bin/mount -t tmpfs tmpfs /var/lib/ceph/osd/ceph-0
Dec  1 04:48:14 np0005540826 frosty_beaver[75776]: Running command: /usr/bin/chown -h ceph:ceph /dev/ceph_vg0/ceph_lv0
Dec  1 04:48:14 np0005540826 frosty_beaver[75776]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Dec  1 04:48:14 np0005540826 frosty_beaver[75776]: Running command: /usr/bin/ln -s /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-0/block
Dec  1 04:48:14 np0005540826 frosty_beaver[75776]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring mon getmap -o /var/lib/ceph/osd/ceph-0/activate.monmap
Dec  1 04:48:14 np0005540826 lvm[75842]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec  1 04:48:14 np0005540826 lvm[75842]: VG ceph_vg0 finished
Dec  1 04:48:15 np0005540826 frosty_beaver[75776]: stderr: got monmap epoch 1
Dec  1 04:48:15 np0005540826 frosty_beaver[75776]: --> Creating keyring file for osd.0
Dec  1 04:48:15 np0005540826 frosty_beaver[75776]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0/keyring
Dec  1 04:48:15 np0005540826 frosty_beaver[75776]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0/
Dec  1 04:48:15 np0005540826 frosty_beaver[75776]: Running command: /usr/bin/ceph-osd --cluster ceph --osd-objectstore bluestore --mkfs -i 0 --monmap /var/lib/ceph/osd/ceph-0/activate.monmap --keyfile - --osdspec-affinity default_drive_group --osd-data /var/lib/ceph/osd/ceph-0/ --osd-uuid a81d93fb-5215-4a2c-87f7-124573e3e396 --setuser ceph --setgroup ceph
Dec  1 04:48:17 np0005540826 frosty_beaver[75776]: stderr: 2025-12-01T09:48:15.329+0000 7ffa75acc740 -1 bluestore(/var/lib/ceph/osd/ceph-0//block) No valid bdev label found
Dec  1 04:48:17 np0005540826 frosty_beaver[75776]: stderr: 2025-12-01T09:48:15.591+0000 7ffa75acc740 -1 bluestore(/var/lib/ceph/osd/ceph-0/) _read_fsid unparsable uuid
Dec  1 04:48:17 np0005540826 frosty_beaver[75776]: --> ceph-volume lvm prepare successful for: ceph_vg0/ceph_lv0
Dec  1 04:48:17 np0005540826 frosty_beaver[75776]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Dec  1 04:48:17 np0005540826 frosty_beaver[75776]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg0/ceph_lv0 --path /var/lib/ceph/osd/ceph-0 --no-mon-config
Dec  1 04:48:18 np0005540826 frosty_beaver[75776]: Running command: /usr/bin/ln -snf /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-0/block
Dec  1 04:48:18 np0005540826 frosty_beaver[75776]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-0/block
Dec  1 04:48:18 np0005540826 frosty_beaver[75776]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Dec  1 04:48:18 np0005540826 frosty_beaver[75776]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Dec  1 04:48:18 np0005540826 frosty_beaver[75776]: --> ceph-volume lvm activate successful for osd ID: 0
Dec  1 04:48:18 np0005540826 frosty_beaver[75776]: --> ceph-volume lvm create successful for: ceph_vg0/ceph_lv0
Dec  1 04:48:18 np0005540826 systemd[1]: libpod-47418faffd383d7e2a9cf48eebfaf4f7107446864a1f1c86b14186805ece9669.scope: Deactivated successfully.
Dec  1 04:48:18 np0005540826 systemd[1]: libpod-47418faffd383d7e2a9cf48eebfaf4f7107446864a1f1c86b14186805ece9669.scope: Consumed 2.079s CPU time.
Dec  1 04:48:18 np0005540826 podman[76753]: 2025-12-01 09:48:18.349711413 +0000 UTC m=+0.024788288 container died 47418faffd383d7e2a9cf48eebfaf4f7107446864a1f1c86b14186805ece9669 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=frosty_beaver, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:48:18 np0005540826 systemd[1]: var-lib-containers-storage-overlay-aa85dfd5e5713fcb041b70d182d720922753d5edbe2ba048237e4090808c8ff3-merged.mount: Deactivated successfully.
Dec  1 04:48:18 np0005540826 podman[76753]: 2025-12-01 09:48:18.388493269 +0000 UTC m=+0.063570134 container remove 47418faffd383d7e2a9cf48eebfaf4f7107446864a1f1c86b14186805ece9669 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=frosty_beaver, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_REF=squid, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:48:18 np0005540826 systemd[1]: libpod-conmon-47418faffd383d7e2a9cf48eebfaf4f7107446864a1f1c86b14186805ece9669.scope: Deactivated successfully.
Dec  1 04:48:18 np0005540826 podman[76859]: 2025-12-01 09:48:18.923279553 +0000 UTC m=+0.051153094 container create 1283111e1c12a9e959f521b832298ba0c1f6f8b27f55321f2bc84a6c2e0f02ca (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=affectionate_lumiere, OSD_FLAVOR=default, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1)
Dec  1 04:48:18 np0005540826 systemd[1]: Started libpod-conmon-1283111e1c12a9e959f521b832298ba0c1f6f8b27f55321f2bc84a6c2e0f02ca.scope.
Dec  1 04:48:18 np0005540826 systemd[1]: Started libcrun container.
Dec  1 04:48:18 np0005540826 podman[76859]: 2025-12-01 09:48:18.905307794 +0000 UTC m=+0.033181315 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 04:48:19 np0005540826 podman[76859]: 2025-12-01 09:48:19.003710356 +0000 UTC m=+0.131583897 container init 1283111e1c12a9e959f521b832298ba0c1f6f8b27f55321f2bc84a6c2e0f02ca (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=affectionate_lumiere, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid)
Dec  1 04:48:19 np0005540826 podman[76859]: 2025-12-01 09:48:19.01094236 +0000 UTC m=+0.138815881 container start 1283111e1c12a9e959f521b832298ba0c1f6f8b27f55321f2bc84a6c2e0f02ca (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=affectionate_lumiere, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, ceph=True)
Dec  1 04:48:19 np0005540826 podman[76859]: 2025-12-01 09:48:19.014731638 +0000 UTC m=+0.142605139 container attach 1283111e1c12a9e959f521b832298ba0c1f6f8b27f55321f2bc84a6c2e0f02ca (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=affectionate_lumiere, CEPH_REF=squid, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:48:19 np0005540826 affectionate_lumiere[76875]: 167 167
Dec  1 04:48:19 np0005540826 systemd[1]: libpod-1283111e1c12a9e959f521b832298ba0c1f6f8b27f55321f2bc84a6c2e0f02ca.scope: Deactivated successfully.
Dec  1 04:48:19 np0005540826 podman[76859]: 2025-12-01 09:48:19.017571866 +0000 UTC m=+0.145445367 container died 1283111e1c12a9e959f521b832298ba0c1f6f8b27f55321f2bc84a6c2e0f02ca (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=affectionate_lumiere, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1)
Dec  1 04:48:19 np0005540826 systemd[1]: var-lib-containers-storage-overlay-8291225c26d36fa8c37a2af6efdfcd2abd2912fe1fd3e5f2055b776095106d3f-merged.mount: Deactivated successfully.
Dec  1 04:48:19 np0005540826 podman[76859]: 2025-12-01 09:48:19.055230652 +0000 UTC m=+0.183104173 container remove 1283111e1c12a9e959f521b832298ba0c1f6f8b27f55321f2bc84a6c2e0f02ca (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=affectionate_lumiere, CEPH_REF=squid, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, ceph=True, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:48:19 np0005540826 systemd[1]: libpod-conmon-1283111e1c12a9e959f521b832298ba0c1f6f8b27f55321f2bc84a6c2e0f02ca.scope: Deactivated successfully.
Dec  1 04:48:19 np0005540826 podman[76900]: 2025-12-01 09:48:19.23348902 +0000 UTC m=+0.057672484 container create 70c9f3ae54658063724582cb6fcbd73a3f0ec288cbf90d78dd282f52930b6a8c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=clever_wiles, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, ceph=True, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec  1 04:48:19 np0005540826 systemd[1]: Started libpod-conmon-70c9f3ae54658063724582cb6fcbd73a3f0ec288cbf90d78dd282f52930b6a8c.scope.
Dec  1 04:48:19 np0005540826 podman[76900]: 2025-12-01 09:48:19.201589876 +0000 UTC m=+0.025773430 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 04:48:19 np0005540826 systemd[1]: Started libcrun container.
Dec  1 04:48:19 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6cae9c62ab8ee98db0fca6e0a88210ca7c49b05308ceb4de47da69a2a214cafc/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:48:19 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6cae9c62ab8ee98db0fca6e0a88210ca7c49b05308ceb4de47da69a2a214cafc/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:48:19 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6cae9c62ab8ee98db0fca6e0a88210ca7c49b05308ceb4de47da69a2a214cafc/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:48:19 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6cae9c62ab8ee98db0fca6e0a88210ca7c49b05308ceb4de47da69a2a214cafc/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:48:19 np0005540826 podman[76900]: 2025-12-01 09:48:19.420798895 +0000 UTC m=+0.244982379 container init 70c9f3ae54658063724582cb6fcbd73a3f0ec288cbf90d78dd282f52930b6a8c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=clever_wiles, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, CEPH_REF=squid, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:48:19 np0005540826 podman[76900]: 2025-12-01 09:48:19.434177575 +0000 UTC m=+0.258361059 container start 70c9f3ae54658063724582cb6fcbd73a3f0ec288cbf90d78dd282f52930b6a8c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=clever_wiles, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS)
Dec  1 04:48:19 np0005540826 podman[76900]: 2025-12-01 09:48:19.465039694 +0000 UTC m=+0.289223168 container attach 70c9f3ae54658063724582cb6fcbd73a3f0ec288cbf90d78dd282f52930b6a8c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=clever_wiles, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:48:19 np0005540826 clever_wiles[76917]: {
Dec  1 04:48:19 np0005540826 clever_wiles[76917]:    "0": [
Dec  1 04:48:19 np0005540826 clever_wiles[76917]:        {
Dec  1 04:48:19 np0005540826 clever_wiles[76917]:            "devices": [
Dec  1 04:48:19 np0005540826 clever_wiles[76917]:                "/dev/loop3"
Dec  1 04:48:19 np0005540826 clever_wiles[76917]:            ],
Dec  1 04:48:19 np0005540826 clever_wiles[76917]:            "lv_name": "ceph_lv0",
Dec  1 04:48:19 np0005540826 clever_wiles[76917]:            "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec  1 04:48:19 np0005540826 clever_wiles[76917]:            "lv_size": "21470642176",
Dec  1 04:48:19 np0005540826 clever_wiles[76917]:            "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=18ab0z-2SjB-cLmt-dSvY-Nfin-V9Xg-I8Y5bV,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=365f19c2-81e5-5edd-b6b4-280555214d3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=a81d93fb-5215-4a2c-87f7-124573e3e396,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec  1 04:48:19 np0005540826 clever_wiles[76917]:            "lv_uuid": "18ab0z-2SjB-cLmt-dSvY-Nfin-V9Xg-I8Y5bV",
Dec  1 04:48:19 np0005540826 clever_wiles[76917]:            "name": "ceph_lv0",
Dec  1 04:48:19 np0005540826 clever_wiles[76917]:            "path": "/dev/ceph_vg0/ceph_lv0",
Dec  1 04:48:19 np0005540826 clever_wiles[76917]:            "tags": {
Dec  1 04:48:19 np0005540826 clever_wiles[76917]:                "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec  1 04:48:19 np0005540826 clever_wiles[76917]:                "ceph.block_uuid": "18ab0z-2SjB-cLmt-dSvY-Nfin-V9Xg-I8Y5bV",
Dec  1 04:48:19 np0005540826 clever_wiles[76917]:                "ceph.cephx_lockbox_secret": "",
Dec  1 04:48:19 np0005540826 clever_wiles[76917]:                "ceph.cluster_fsid": "365f19c2-81e5-5edd-b6b4-280555214d3a",
Dec  1 04:48:19 np0005540826 clever_wiles[76917]:                "ceph.cluster_name": "ceph",
Dec  1 04:48:19 np0005540826 clever_wiles[76917]:                "ceph.crush_device_class": "",
Dec  1 04:48:19 np0005540826 clever_wiles[76917]:                "ceph.encrypted": "0",
Dec  1 04:48:19 np0005540826 clever_wiles[76917]:                "ceph.osd_fsid": "a81d93fb-5215-4a2c-87f7-124573e3e396",
Dec  1 04:48:19 np0005540826 clever_wiles[76917]:                "ceph.osd_id": "0",
Dec  1 04:48:19 np0005540826 clever_wiles[76917]:                "ceph.osdspec_affinity": "default_drive_group",
Dec  1 04:48:19 np0005540826 clever_wiles[76917]:                "ceph.type": "block",
Dec  1 04:48:19 np0005540826 clever_wiles[76917]:                "ceph.vdo": "0",
Dec  1 04:48:19 np0005540826 clever_wiles[76917]:                "ceph.with_tpm": "0"
Dec  1 04:48:19 np0005540826 clever_wiles[76917]:            },
Dec  1 04:48:19 np0005540826 clever_wiles[76917]:            "type": "block",
Dec  1 04:48:19 np0005540826 clever_wiles[76917]:            "vg_name": "ceph_vg0"
Dec  1 04:48:19 np0005540826 clever_wiles[76917]:        }
Dec  1 04:48:19 np0005540826 clever_wiles[76917]:    ]
Dec  1 04:48:19 np0005540826 clever_wiles[76917]: }
Dec  1 04:48:19 np0005540826 systemd[1]: libpod-70c9f3ae54658063724582cb6fcbd73a3f0ec288cbf90d78dd282f52930b6a8c.scope: Deactivated successfully.
Dec  1 04:48:19 np0005540826 podman[76900]: 2025-12-01 09:48:19.727286528 +0000 UTC m=+0.551470012 container died 70c9f3ae54658063724582cb6fcbd73a3f0ec288cbf90d78dd282f52930b6a8c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=clever_wiles, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0)
Dec  1 04:48:19 np0005540826 systemd[1]: var-lib-containers-storage-overlay-6cae9c62ab8ee98db0fca6e0a88210ca7c49b05308ceb4de47da69a2a214cafc-merged.mount: Deactivated successfully.
Dec  1 04:48:19 np0005540826 podman[76900]: 2025-12-01 09:48:19.766874314 +0000 UTC m=+0.591057778 container remove 70c9f3ae54658063724582cb6fcbd73a3f0ec288cbf90d78dd282f52930b6a8c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=clever_wiles, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.schema-version=1.0)
Dec  1 04:48:19 np0005540826 systemd[1]: libpod-conmon-70c9f3ae54658063724582cb6fcbd73a3f0ec288cbf90d78dd282f52930b6a8c.scope: Deactivated successfully.
Dec  1 04:48:20 np0005540826 podman[77025]: 2025-12-01 09:48:20.312157187 +0000 UTC m=+0.039671473 container create ba695c1e0273ff89c09ec53bf9019add77183f5db1aefe3dacd875033d9ee9bc (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=focused_wiles, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, OSD_FLAVOR=default)
Dec  1 04:48:20 np0005540826 systemd[1]: Started libpod-conmon-ba695c1e0273ff89c09ec53bf9019add77183f5db1aefe3dacd875033d9ee9bc.scope.
Dec  1 04:48:20 np0005540826 systemd[1]: Started libcrun container.
Dec  1 04:48:20 np0005540826 podman[77025]: 2025-12-01 09:48:20.392830935 +0000 UTC m=+0.120345231 container init ba695c1e0273ff89c09ec53bf9019add77183f5db1aefe3dacd875033d9ee9bc (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=focused_wiles, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, ceph=True, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:48:20 np0005540826 podman[77025]: 2025-12-01 09:48:20.295764548 +0000 UTC m=+0.023278854 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 04:48:20 np0005540826 podman[77025]: 2025-12-01 09:48:20.399885378 +0000 UTC m=+0.127399694 container start ba695c1e0273ff89c09ec53bf9019add77183f5db1aefe3dacd875033d9ee9bc (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=focused_wiles, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec  1 04:48:20 np0005540826 focused_wiles[77041]: 167 167
Dec  1 04:48:20 np0005540826 systemd[1]: libpod-ba695c1e0273ff89c09ec53bf9019add77183f5db1aefe3dacd875033d9ee9bc.scope: Deactivated successfully.
Dec  1 04:48:20 np0005540826 podman[77025]: 2025-12-01 09:48:20.407776794 +0000 UTC m=+0.135291080 container attach ba695c1e0273ff89c09ec53bf9019add77183f5db1aefe3dacd875033d9ee9bc (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=focused_wiles, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_REF=squid, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:48:20 np0005540826 podman[77025]: 2025-12-01 09:48:20.408162958 +0000 UTC m=+0.135677244 container died ba695c1e0273ff89c09ec53bf9019add77183f5db1aefe3dacd875033d9ee9bc (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=focused_wiles, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Dec  1 04:48:20 np0005540826 systemd[1]: var-lib-containers-storage-overlay-70cdc9a09a35f35018da67b8b3d4add80cc5d0f64dc4f232cd24fc037710dc1c-merged.mount: Deactivated successfully.
Dec  1 04:48:20 np0005540826 podman[77025]: 2025-12-01 09:48:20.49033808 +0000 UTC m=+0.217852366 container remove ba695c1e0273ff89c09ec53bf9019add77183f5db1aefe3dacd875033d9ee9bc (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=focused_wiles, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325)
Dec  1 04:48:20 np0005540826 systemd[1]: libpod-conmon-ba695c1e0273ff89c09ec53bf9019add77183f5db1aefe3dacd875033d9ee9bc.scope: Deactivated successfully.
Dec  1 04:48:20 np0005540826 podman[77073]: 2025-12-01 09:48:20.818811324 +0000 UTC m=+0.051853419 container create ccc079367752945b785c3c7f8568482840e66501d4a5557a0079dc1bcf1c1482 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-0-activate-test, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS)
Dec  1 04:48:20 np0005540826 systemd[1]: Started libpod-conmon-ccc079367752945b785c3c7f8568482840e66501d4a5557a0079dc1bcf1c1482.scope.
Dec  1 04:48:20 np0005540826 podman[77073]: 2025-12-01 09:48:20.800261909 +0000 UTC m=+0.033304024 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 04:48:20 np0005540826 systemd[1]: Started libcrun container.
Dec  1 04:48:20 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/05ac977996c152d868f6bfde554dcebf218310a427bb5f3cbc956cbf5d746a2b/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:48:20 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/05ac977996c152d868f6bfde554dcebf218310a427bb5f3cbc956cbf5d746a2b/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:48:20 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/05ac977996c152d868f6bfde554dcebf218310a427bb5f3cbc956cbf5d746a2b/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:48:20 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/05ac977996c152d868f6bfde554dcebf218310a427bb5f3cbc956cbf5d746a2b/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:48:20 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/05ac977996c152d868f6bfde554dcebf218310a427bb5f3cbc956cbf5d746a2b/merged/var/lib/ceph/osd/ceph-0 supports timestamps until 2038 (0x7fffffff)
Dec  1 04:48:20 np0005540826 podman[77073]: 2025-12-01 09:48:20.930105365 +0000 UTC m=+0.163147580 container init ccc079367752945b785c3c7f8568482840e66501d4a5557a0079dc1bcf1c1482 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-0-activate-test, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS)
Dec  1 04:48:20 np0005540826 podman[77073]: 2025-12-01 09:48:20.939564329 +0000 UTC m=+0.172606464 container start ccc079367752945b785c3c7f8568482840e66501d4a5557a0079dc1bcf1c1482 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-0-activate-test, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True)
Dec  1 04:48:20 np0005540826 podman[77073]: 2025-12-01 09:48:20.943305134 +0000 UTC m=+0.176347229 container attach ccc079367752945b785c3c7f8568482840e66501d4a5557a0079dc1bcf1c1482 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-0-activate-test, ceph=True, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1)
Dec  1 04:48:21 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-0-activate-test[77089]: usage: ceph-volume activate [-h] [--osd-id OSD_ID] [--osd-uuid OSD_FSID]
Dec  1 04:48:21 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-0-activate-test[77089]:                            [--no-systemd] [--no-tmpfs]
Dec  1 04:48:21 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-0-activate-test[77089]: ceph-volume activate: error: unrecognized arguments: --bad-option
Dec  1 04:48:21 np0005540826 systemd[1]: libpod-ccc079367752945b785c3c7f8568482840e66501d4a5557a0079dc1bcf1c1482.scope: Deactivated successfully.
Dec  1 04:48:21 np0005540826 podman[77073]: 2025-12-01 09:48:21.116191994 +0000 UTC m=+0.349234179 container died ccc079367752945b785c3c7f8568482840e66501d4a5557a0079dc1bcf1c1482 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-0-activate-test, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec  1 04:48:21 np0005540826 systemd[1]: var-lib-containers-storage-overlay-05ac977996c152d868f6bfde554dcebf218310a427bb5f3cbc956cbf5d746a2b-merged.mount: Deactivated successfully.
Dec  1 04:48:21 np0005540826 podman[77073]: 2025-12-01 09:48:21.158336672 +0000 UTC m=+0.391378767 container remove ccc079367752945b785c3c7f8568482840e66501d4a5557a0079dc1bcf1c1482 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-0-activate-test, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:48:21 np0005540826 systemd[1]: libpod-conmon-ccc079367752945b785c3c7f8568482840e66501d4a5557a0079dc1bcf1c1482.scope: Deactivated successfully.
Dec  1 04:48:21 np0005540826 systemd[1]: Reloading.
Dec  1 04:48:21 np0005540826 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:48:21 np0005540826 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:48:21 np0005540826 systemd[1]: Reloading.
Dec  1 04:48:21 np0005540826 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:48:21 np0005540826 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:48:21 np0005540826 systemd[1]: Starting Ceph osd.0 for 365f19c2-81e5-5edd-b6b4-280555214d3a...
Dec  1 04:48:22 np0005540826 podman[77247]: 2025-12-01 09:48:22.126460966 +0000 UTC m=+0.034449885 container create fbcab71089deb72860271a17b1b33c4478aafdc511e8eb622963f463fc13cfc2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-0-activate, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:48:22 np0005540826 systemd[1]: Started libcrun container.
Dec  1 04:48:22 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d9aa30df895e48e196c48089c796e374b50f824a9b23447285a5c3f698f4442e/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:48:22 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d9aa30df895e48e196c48089c796e374b50f824a9b23447285a5c3f698f4442e/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:48:22 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d9aa30df895e48e196c48089c796e374b50f824a9b23447285a5c3f698f4442e/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:48:22 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d9aa30df895e48e196c48089c796e374b50f824a9b23447285a5c3f698f4442e/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:48:22 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d9aa30df895e48e196c48089c796e374b50f824a9b23447285a5c3f698f4442e/merged/var/lib/ceph/osd/ceph-0 supports timestamps until 2038 (0x7fffffff)
Dec  1 04:48:22 np0005540826 podman[77247]: 2025-12-01 09:48:22.194989601 +0000 UTC m=+0.102978620 container init fbcab71089deb72860271a17b1b33c4478aafdc511e8eb622963f463fc13cfc2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-0-activate, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, io.buildah.version=1.40.1)
Dec  1 04:48:22 np0005540826 podman[77247]: 2025-12-01 09:48:22.202387246 +0000 UTC m=+0.110376215 container start fbcab71089deb72860271a17b1b33c4478aafdc511e8eb622963f463fc13cfc2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-0-activate, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:48:22 np0005540826 podman[77247]: 2025-12-01 09:48:22.207519308 +0000 UTC m=+0.115508257 container attach fbcab71089deb72860271a17b1b33c4478aafdc511e8eb622963f463fc13cfc2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-0-activate, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.build-date=20250325, io.buildah.version=1.40.1)
Dec  1 04:48:22 np0005540826 podman[77247]: 2025-12-01 09:48:22.110551667 +0000 UTC m=+0.018540616 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 04:48:22 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-0-activate[77262]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec  1 04:48:22 np0005540826 bash[77247]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec  1 04:48:22 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-0-activate[77262]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec  1 04:48:22 np0005540826 bash[77247]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec  1 04:48:22 np0005540826 lvm[77343]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec  1 04:48:22 np0005540826 lvm[77343]: VG ceph_vg0 finished
Dec  1 04:48:22 np0005540826 lvm[77347]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec  1 04:48:22 np0005540826 lvm[77347]: VG ceph_vg0 finished
Dec  1 04:48:22 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-0-activate[77262]: --> Failed to activate via raw: did not find any matching OSD to activate
Dec  1 04:48:22 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-0-activate[77262]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec  1 04:48:22 np0005540826 bash[77247]: --> Failed to activate via raw: did not find any matching OSD to activate
Dec  1 04:48:22 np0005540826 bash[77247]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec  1 04:48:22 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-0-activate[77262]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec  1 04:48:22 np0005540826 bash[77247]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec  1 04:48:23 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-0-activate[77262]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Dec  1 04:48:23 np0005540826 bash[77247]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Dec  1 04:48:23 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-0-activate[77262]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg0/ceph_lv0 --path /var/lib/ceph/osd/ceph-0 --no-mon-config
Dec  1 04:48:23 np0005540826 bash[77247]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg0/ceph_lv0 --path /var/lib/ceph/osd/ceph-0 --no-mon-config
Dec  1 04:48:23 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-0-activate[77262]: Running command: /usr/bin/ln -snf /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-0/block
Dec  1 04:48:23 np0005540826 bash[77247]: Running command: /usr/bin/ln -snf /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-0/block
Dec  1 04:48:23 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-0-activate[77262]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-0/block
Dec  1 04:48:23 np0005540826 bash[77247]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-0/block
Dec  1 04:48:23 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-0-activate[77262]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Dec  1 04:48:23 np0005540826 bash[77247]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Dec  1 04:48:23 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-0-activate[77262]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Dec  1 04:48:23 np0005540826 bash[77247]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Dec  1 04:48:23 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-0-activate[77262]: --> ceph-volume lvm activate successful for osd ID: 0
Dec  1 04:48:23 np0005540826 bash[77247]: --> ceph-volume lvm activate successful for osd ID: 0
Dec  1 04:48:23 np0005540826 systemd[1]: libpod-fbcab71089deb72860271a17b1b33c4478aafdc511e8eb622963f463fc13cfc2.scope: Deactivated successfully.
Dec  1 04:48:23 np0005540826 podman[77247]: 2025-12-01 09:48:23.374642932 +0000 UTC m=+1.282631861 container died fbcab71089deb72860271a17b1b33c4478aafdc511e8eb622963f463fc13cfc2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-0-activate, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec  1 04:48:23 np0005540826 systemd[1]: libpod-fbcab71089deb72860271a17b1b33c4478aafdc511e8eb622963f463fc13cfc2.scope: Consumed 1.203s CPU time.
Dec  1 04:48:23 np0005540826 systemd[1]: var-lib-containers-storage-overlay-d9aa30df895e48e196c48089c796e374b50f824a9b23447285a5c3f698f4442e-merged.mount: Deactivated successfully.
Dec  1 04:48:23 np0005540826 podman[77247]: 2025-12-01 09:48:23.428219778 +0000 UTC m=+1.336208707 container remove fbcab71089deb72860271a17b1b33c4478aafdc511e8eb622963f463fc13cfc2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-0-activate, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0)
Dec  1 04:48:23 np0005540826 podman[77506]: 2025-12-01 09:48:23.638661977 +0000 UTC m=+0.044059169 container create 37acc53e309fad0d25d28516a11ce4dd43bcac4e1123466f33bb91e822cb9d56 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec  1 04:48:23 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/42cd1e20ce3d12b1f7bac6f7ca1f79d3d04e139e79e85ac5336811d7149a8bda/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:48:23 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/42cd1e20ce3d12b1f7bac6f7ca1f79d3d04e139e79e85ac5336811d7149a8bda/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:48:23 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/42cd1e20ce3d12b1f7bac6f7ca1f79d3d04e139e79e85ac5336811d7149a8bda/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:48:23 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/42cd1e20ce3d12b1f7bac6f7ca1f79d3d04e139e79e85ac5336811d7149a8bda/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:48:23 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/42cd1e20ce3d12b1f7bac6f7ca1f79d3d04e139e79e85ac5336811d7149a8bda/merged/var/lib/ceph/osd/ceph-0 supports timestamps until 2038 (0x7fffffff)
Dec  1 04:48:23 np0005540826 podman[77506]: 2025-12-01 09:48:23.713190129 +0000 UTC m=+0.118587321 container init 37acc53e309fad0d25d28516a11ce4dd43bcac4e1123466f33bb91e822cb9d56 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325)
Dec  1 04:48:23 np0005540826 podman[77506]: 2025-12-01 09:48:23.62104301 +0000 UTC m=+0.026440192 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 04:48:23 np0005540826 podman[77506]: 2025-12-01 09:48:23.718563866 +0000 UTC m=+0.123961048 container start 37acc53e309fad0d25d28516a11ce4dd43bcac4e1123466f33bb91e822cb9d56 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, io.buildah.version=1.40.1, CEPH_REF=squid)
Dec  1 04:48:23 np0005540826 bash[77506]: 37acc53e309fad0d25d28516a11ce4dd43bcac4e1123466f33bb91e822cb9d56
Dec  1 04:48:23 np0005540826 systemd[1]: Started Ceph osd.0 for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 04:48:23 np0005540826 ceph-osd[77525]: set uid:gid to 167:167 (ceph:ceph)
Dec  1 04:48:23 np0005540826 ceph-osd[77525]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-osd, pid 2
Dec  1 04:48:23 np0005540826 ceph-osd[77525]: pidfile_write: ignore empty --pid-file
Dec  1 04:48:23 np0005540826 ceph-osd[77525]: bdev(0x55e0b3667800 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Dec  1 04:48:23 np0005540826 ceph-osd[77525]: bdev(0x55e0b3667800 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Dec  1 04:48:23 np0005540826 ceph-osd[77525]: bdev(0x55e0b3667800 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec  1 04:48:23 np0005540826 ceph-osd[77525]: bdev(0x55e0b3667800 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec  1 04:48:23 np0005540826 ceph-osd[77525]: bdev(0x55e0b3667800 /var/lib/ceph/osd/ceph-0/block) close
Dec  1 04:48:24 np0005540826 ceph-osd[77525]: bdev(0x55e0b3667800 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Dec  1 04:48:24 np0005540826 ceph-osd[77525]: bdev(0x55e0b3667800 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Dec  1 04:48:24 np0005540826 ceph-osd[77525]: bdev(0x55e0b3667800 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec  1 04:48:24 np0005540826 ceph-osd[77525]: bdev(0x55e0b3667800 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec  1 04:48:24 np0005540826 ceph-osd[77525]: bdev(0x55e0b3667800 /var/lib/ceph/osd/ceph-0/block) close
Dec  1 04:48:24 np0005540826 ceph-osd[77525]: bdev(0x55e0b3667800 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Dec  1 04:48:24 np0005540826 ceph-osd[77525]: bdev(0x55e0b3667800 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Dec  1 04:48:24 np0005540826 ceph-osd[77525]: bdev(0x55e0b3667800 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec  1 04:48:24 np0005540826 ceph-osd[77525]: bdev(0x55e0b3667800 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec  1 04:48:24 np0005540826 ceph-osd[77525]: bdev(0x55e0b3667800 /var/lib/ceph/osd/ceph-0/block) close
Dec  1 04:48:24 np0005540826 ceph-osd[77525]: bdev(0x55e0b3667800 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Dec  1 04:48:24 np0005540826 ceph-osd[77525]: bdev(0x55e0b3667800 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Dec  1 04:48:24 np0005540826 ceph-osd[77525]: bdev(0x55e0b3667800 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec  1 04:48:24 np0005540826 ceph-osd[77525]: bdev(0x55e0b3667800 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec  1 04:48:24 np0005540826 ceph-osd[77525]: bdev(0x55e0b3667800 /var/lib/ceph/osd/ceph-0/block) close
Dec  1 04:48:24 np0005540826 podman[77631]: 2025-12-01 09:48:24.281327637 +0000 UTC m=+0.036372756 container create 2ebe7003eaad555cc835e7ae355ad91d6eda81256c5c327eddc70696dd4ebcf4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=peaceful_robinson, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, OSD_FLAVOR=default)
Dec  1 04:48:24 np0005540826 systemd[1]: Started libpod-conmon-2ebe7003eaad555cc835e7ae355ad91d6eda81256c5c327eddc70696dd4ebcf4.scope.
Dec  1 04:48:24 np0005540826 systemd[1]: Started libcrun container.
Dec  1 04:48:24 np0005540826 podman[77631]: 2025-12-01 09:48:24.266814936 +0000 UTC m=+0.021860075 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 04:48:24 np0005540826 podman[77631]: 2025-12-01 09:48:24.377503309 +0000 UTC m=+0.132548508 container init 2ebe7003eaad555cc835e7ae355ad91d6eda81256c5c327eddc70696dd4ebcf4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=peaceful_robinson, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:48:24 np0005540826 podman[77631]: 2025-12-01 09:48:24.389790411 +0000 UTC m=+0.144835570 container start 2ebe7003eaad555cc835e7ae355ad91d6eda81256c5c327eddc70696dd4ebcf4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=peaceful_robinson, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:48:24 np0005540826 ceph-osd[77525]: bdev(0x55e0b3667800 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Dec  1 04:48:24 np0005540826 ceph-osd[77525]: bdev(0x55e0b3667800 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Dec  1 04:48:24 np0005540826 ceph-osd[77525]: bdev(0x55e0b3667800 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec  1 04:48:24 np0005540826 ceph-osd[77525]: bdev(0x55e0b3667800 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec  1 04:48:24 np0005540826 ceph-osd[77525]: bdev(0x55e0b3667800 /var/lib/ceph/osd/ceph-0/block) close
Dec  1 04:48:24 np0005540826 podman[77631]: 2025-12-01 09:48:24.394318395 +0000 UTC m=+0.149363554 container attach 2ebe7003eaad555cc835e7ae355ad91d6eda81256c5c327eddc70696dd4ebcf4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=peaceful_robinson, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325)
Dec  1 04:48:24 np0005540826 peaceful_robinson[77647]: 167 167
Dec  1 04:48:24 np0005540826 systemd[1]: libpod-2ebe7003eaad555cc835e7ae355ad91d6eda81256c5c327eddc70696dd4ebcf4.scope: Deactivated successfully.
Dec  1 04:48:24 np0005540826 conmon[77647]: conmon 2ebe7003eaad555cc835 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-2ebe7003eaad555cc835e7ae355ad91d6eda81256c5c327eddc70696dd4ebcf4.scope/container/memory.events
Dec  1 04:48:24 np0005540826 ceph-osd[77525]: bdev(0x55e0b3667800 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Dec  1 04:48:24 np0005540826 ceph-osd[77525]: bdev(0x55e0b3667800 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Dec  1 04:48:24 np0005540826 ceph-osd[77525]: bdev(0x55e0b3667800 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec  1 04:48:24 np0005540826 ceph-osd[77525]: bdev(0x55e0b3667800 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec  1 04:48:24 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Dec  1 04:48:24 np0005540826 ceph-osd[77525]: bdev(0x55e0b3667c00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Dec  1 04:48:24 np0005540826 ceph-osd[77525]: bdev(0x55e0b3667c00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Dec  1 04:48:24 np0005540826 ceph-osd[77525]: bdev(0x55e0b3667c00 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec  1 04:48:24 np0005540826 ceph-osd[77525]: bdev(0x55e0b3667c00 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec  1 04:48:24 np0005540826 ceph-osd[77525]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-0/block size 20 GiB
Dec  1 04:48:24 np0005540826 ceph-osd[77525]: bdev(0x55e0b3667c00 /var/lib/ceph/osd/ceph-0/block) close
Dec  1 04:48:24 np0005540826 podman[77654]: 2025-12-01 09:48:24.455124265 +0000 UTC m=+0.034500089 container died 2ebe7003eaad555cc835e7ae355ad91d6eda81256c5c327eddc70696dd4ebcf4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=peaceful_robinson, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS)
Dec  1 04:48:24 np0005540826 systemd[1]: var-lib-containers-storage-overlay-4b27e95d6611d736617ad406a9ef0761b31a2755662c3f92285df5c13da898f7-merged.mount: Deactivated successfully.
Dec  1 04:48:24 np0005540826 podman[77654]: 2025-12-01 09:48:24.495674882 +0000 UTC m=+0.075050716 container remove 2ebe7003eaad555cc835e7ae355ad91d6eda81256c5c327eddc70696dd4ebcf4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=peaceful_robinson, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_REF=squid, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2)
Dec  1 04:48:24 np0005540826 systemd[1]: libpod-conmon-2ebe7003eaad555cc835e7ae355ad91d6eda81256c5c327eddc70696dd4ebcf4.scope: Deactivated successfully.
Dec  1 04:48:24 np0005540826 podman[77679]: 2025-12-01 09:48:24.657535649 +0000 UTC m=+0.041623955 container create b19a8ad78d17b8bc41314cfd90d951f44b9792500495bf0dc55faf61792f3a25 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=stoic_galileo, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_REF=squid, org.label-schema.schema-version=1.0)
Dec  1 04:48:24 np0005540826 ceph-osd[77525]: bdev(0x55e0b3667800 /var/lib/ceph/osd/ceph-0/block) close
Dec  1 04:48:24 np0005540826 systemd[1]: Started libpod-conmon-b19a8ad78d17b8bc41314cfd90d951f44b9792500495bf0dc55faf61792f3a25.scope.
Dec  1 04:48:24 np0005540826 systemd[1]: Started libcrun container.
Dec  1 04:48:24 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/48bc1c3be9af517aeb9ee1c62ec0ec0ce6c7236378264ffbd50ba52aa78970b1/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:48:24 np0005540826 podman[77679]: 2025-12-01 09:48:24.638061416 +0000 UTC m=+0.022149742 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 04:48:24 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/48bc1c3be9af517aeb9ee1c62ec0ec0ce6c7236378264ffbd50ba52aa78970b1/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:48:24 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/48bc1c3be9af517aeb9ee1c62ec0ec0ce6c7236378264ffbd50ba52aa78970b1/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:48:24 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/48bc1c3be9af517aeb9ee1c62ec0ec0ce6c7236378264ffbd50ba52aa78970b1/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:48:24 np0005540826 podman[77679]: 2025-12-01 09:48:24.747093165 +0000 UTC m=+0.131181461 container init b19a8ad78d17b8bc41314cfd90d951f44b9792500495bf0dc55faf61792f3a25 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=stoic_galileo, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec  1 04:48:24 np0005540826 podman[77679]: 2025-12-01 09:48:24.753335067 +0000 UTC m=+0.137423363 container start b19a8ad78d17b8bc41314cfd90d951f44b9792500495bf0dc55faf61792f3a25 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=stoic_galileo, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:48:24 np0005540826 podman[77679]: 2025-12-01 09:48:24.75623875 +0000 UTC m=+0.140327096 container attach b19a8ad78d17b8bc41314cfd90d951f44b9792500495bf0dc55faf61792f3a25 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=stoic_galileo, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec  1 04:48:24 np0005540826 ceph-osd[77525]: starting osd.0 osd_data /var/lib/ceph/osd/ceph-0 /var/lib/ceph/osd/ceph-0/journal
Dec  1 04:48:24 np0005540826 ceph-osd[77525]: load: jerasure load: lrc 
Dec  1 04:48:24 np0005540826 ceph-osd[77525]: bdev(0x55e0b450cc00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Dec  1 04:48:24 np0005540826 ceph-osd[77525]: bdev(0x55e0b450cc00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Dec  1 04:48:24 np0005540826 ceph-osd[77525]: bdev(0x55e0b450cc00 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec  1 04:48:24 np0005540826 ceph-osd[77525]: bdev(0x55e0b450cc00 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec  1 04:48:24 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Dec  1 04:48:24 np0005540826 ceph-osd[77525]: bdev(0x55e0b450cc00 /var/lib/ceph/osd/ceph-0/block) close
Dec  1 04:48:25 np0005540826 ceph-osd[77525]: bdev(0x55e0b450cc00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Dec  1 04:48:25 np0005540826 ceph-osd[77525]: bdev(0x55e0b450cc00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Dec  1 04:48:25 np0005540826 ceph-osd[77525]: bdev(0x55e0b450cc00 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec  1 04:48:25 np0005540826 ceph-osd[77525]: bdev(0x55e0b450cc00 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec  1 04:48:25 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Dec  1 04:48:25 np0005540826 ceph-osd[77525]: bdev(0x55e0b450cc00 /var/lib/ceph/osd/ceph-0/block) close
Dec  1 04:48:25 np0005540826 lvm[77779]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec  1 04:48:25 np0005540826 lvm[77779]: VG ceph_vg0 finished
Dec  1 04:48:25 np0005540826 stoic_galileo[77697]: {}
Dec  1 04:48:25 np0005540826 systemd[1]: libpod-b19a8ad78d17b8bc41314cfd90d951f44b9792500495bf0dc55faf61792f3a25.scope: Deactivated successfully.
Dec  1 04:48:25 np0005540826 systemd[1]: libpod-b19a8ad78d17b8bc41314cfd90d951f44b9792500495bf0dc55faf61792f3a25.scope: Consumed 1.069s CPU time.
Dec  1 04:48:25 np0005540826 podman[77679]: 2025-12-01 09:48:25.451269989 +0000 UTC m=+0.835358285 container died b19a8ad78d17b8bc41314cfd90d951f44b9792500495bf0dc55faf61792f3a25 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=stoic_galileo, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, OSD_FLAVOR=default)
Dec  1 04:48:25 np0005540826 systemd[1]: var-lib-containers-storage-overlay-48bc1c3be9af517aeb9ee1c62ec0ec0ce6c7236378264ffbd50ba52aa78970b1-merged.mount: Deactivated successfully.
Dec  1 04:48:25 np0005540826 ceph-osd[77525]: mClockScheduler: set_osd_capacity_params_from_config: osd_bandwidth_cost_per_io: 499321.90 bytes/io, osd_bandwidth_capacity_per_shard 157286400.00 bytes/second
Dec  1 04:48:25 np0005540826 ceph-osd[77525]: osd.0:0.OSDShard using op scheduler mclock_scheduler, cutoff=196
Dec  1 04:48:25 np0005540826 ceph-osd[77525]: bdev(0x55e0b450cc00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Dec  1 04:48:25 np0005540826 ceph-osd[77525]: bdev(0x55e0b450cc00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Dec  1 04:48:25 np0005540826 ceph-osd[77525]: bdev(0x55e0b450cc00 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec  1 04:48:25 np0005540826 ceph-osd[77525]: bdev(0x55e0b450cc00 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec  1 04:48:25 np0005540826 ceph-osd[77525]: bdev(0x55e0b450cc00 /var/lib/ceph/osd/ceph-0/block) close
Dec  1 04:48:25 np0005540826 podman[77679]: 2025-12-01 09:48:25.498155005 +0000 UTC m=+0.882243301 container remove b19a8ad78d17b8bc41314cfd90d951f44b9792500495bf0dc55faf61792f3a25 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=stoic_galileo, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:48:25 np0005540826 systemd[1]: libpod-conmon-b19a8ad78d17b8bc41314cfd90d951f44b9792500495bf0dc55faf61792f3a25.scope: Deactivated successfully.
Dec  1 04:48:25 np0005540826 ceph-osd[77525]: bdev(0x55e0b450cc00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Dec  1 04:48:25 np0005540826 ceph-osd[77525]: bdev(0x55e0b450cc00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Dec  1 04:48:25 np0005540826 ceph-osd[77525]: bdev(0x55e0b450cc00 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec  1 04:48:25 np0005540826 ceph-osd[77525]: bdev(0x55e0b450cc00 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec  1 04:48:25 np0005540826 ceph-osd[77525]: bdev(0x55e0b450cc00 /var/lib/ceph/osd/ceph-0/block) close
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: bdev(0x55e0b450cc00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: bdev(0x55e0b450cc00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: bdev(0x55e0b450cc00 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: bdev(0x55e0b450cc00 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: bdev(0x55e0b450cc00 /var/lib/ceph/osd/ceph-0/block) close
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: bdev(0x55e0b450cc00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: bdev(0x55e0b450cc00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: bdev(0x55e0b450cc00 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: bdev(0x55e0b450cc00 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: bdev(0x55e0b450d000 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: bdev(0x55e0b450d000 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: bdev(0x55e0b450d000 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: bdev(0x55e0b450d000 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-0/block size 20 GiB
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: bluefs mount
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: bluefs _init_alloc shared, id 1, capacity 0x4ffc00000, block size 0x10000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: bluefs mount final locked allocations 0 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: bluefs mount final locked allocations 1 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: bluefs mount final locked allocations 2 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: bluefs mount final locked allocations 3 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: bluefs mount final locked allocations 4 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: bluefs mount shared_bdev_used = 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _prepare_db_environment set db_paths to db,20397110067 db.slow,20397110067
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: RocksDB version: 7.9.2
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Git sha 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Compile date 2025-07-17 03:12:14
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: DB SUMMARY
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: DB Session ID:  H3UVPN3L425TCTEDJE2Z
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: CURRENT file:  CURRENT
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: IDENTITY file:  IDENTITY
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5097 ; 
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                         Options.error_if_exists: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                       Options.create_if_missing: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                         Options.paranoid_checks: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:             Options.flush_verify_memtable_count: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                                     Options.env: 0x55e0b44dddc0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                                      Options.fs: LegacyFileSystem
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                                Options.info_log: 0x55e0b44e17a0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.max_file_opening_threads: 16
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                              Options.statistics: (nil)
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                               Options.use_fsync: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                       Options.max_log_file_size: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                   Options.log_file_time_to_roll: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                       Options.keep_log_file_num: 1000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                    Options.recycle_log_file_num: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                         Options.allow_fallocate: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                        Options.allow_mmap_reads: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                       Options.allow_mmap_writes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                        Options.use_direct_reads: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.create_missing_column_families: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                              Options.db_log_dir: 
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                                 Options.wal_dir: db.wal
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.table_cache_numshardbits: 6
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                         Options.WAL_ttl_seconds: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                       Options.WAL_size_limit_MB: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:             Options.manifest_preallocation_size: 4194304
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                     Options.is_fd_close_on_exec: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                   Options.advise_random_on_open: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                    Options.db_write_buffer_size: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                    Options.write_buffer_manager: 0x55e0b45d6a00
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.access_hint_on_compaction_start: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                      Options.use_adaptive_mutex: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                            Options.rate_limiter: (nil)
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                       Options.wal_recovery_mode: 2
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.enable_thread_tracking: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.enable_pipelined_write: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.unordered_write: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:             Options.write_thread_max_yield_usec: 100
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                               Options.row_cache: None
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                              Options.wal_filter: None
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:             Options.avoid_flush_during_recovery: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:             Options.allow_ingest_behind: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:             Options.two_write_queues: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:             Options.manual_wal_flush: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:             Options.wal_compression: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:             Options.atomic_flush: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                 Options.persist_stats_to_disk: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                 Options.write_dbid_to_manifest: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                 Options.log_readahead_size: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                 Options.best_efforts_recovery: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:             Options.allow_data_in_errors: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:             Options.db_host_id: __hostname__
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:             Options.enforce_single_del_contracts: true
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:             Options.max_background_jobs: 4
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:             Options.max_background_compactions: -1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:             Options.max_subcompactions: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:           Options.writable_file_max_buffer_size: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:             Options.delayed_write_rate : 16777216
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:             Options.max_total_wal_size: 1073741824
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                   Options.stats_dump_period_sec: 600
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                 Options.stats_persist_period_sec: 600
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                          Options.max_open_files: -1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                          Options.bytes_per_sync: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                      Options.wal_bytes_per_sync: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                   Options.strict_bytes_per_sync: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:       Options.compaction_readahead_size: 2097152
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.max_background_flushes: -1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Compression algorithms supported:
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: #011kZSTD supported: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: #011kXpressCompression supported: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: #011kBZip2Compression supported: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: #011kZSTDNotFinalCompression supported: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: #011kLZ4Compression supported: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: #011kZlibCompression supported: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: #011kLZ4HCCompression supported: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: #011kSnappyCompression supported: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Fast CRC32 supported: Supported on x86
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: DMutex implementation: pthread_mutex_t
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: [db/db_impl/db_impl_readonly.cc:25] Opening the db in read only mode
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:        Options.compaction_filter: None
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55e0b44e1b60)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55e0b36fd350#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.compression: LZ4
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:             Options.num_levels: 7
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:           Options.merge_operator: None
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:        Options.compaction_filter: None
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55e0b44e1b60)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55e0b36fd350#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.compression: LZ4
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:             Options.num_levels: 7
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:           Options.merge_operator: None
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:        Options.compaction_filter: None
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55e0b44e1b60)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55e0b36fd350#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.compression: LZ4
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:             Options.num_levels: 7
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:           Options.merge_operator: None
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:        Options.compaction_filter: None
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55e0b44e1b60)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55e0b36fd350#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.compression: LZ4
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:             Options.num_levels: 7
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:           Options.merge_operator: None
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:        Options.compaction_filter: None
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55e0b44e1b60)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55e0b36fd350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.compression: LZ4
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:             Options.num_levels: 7
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:           Options.merge_operator: None
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:        Options.compaction_filter: None
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55e0b44e1b60)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55e0b36fd350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.compression: LZ4
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:             Options.num_levels: 7
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:           Options.merge_operator: None
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:        Options.compaction_filter: None
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55e0b44e1b60)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55e0b36fd350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.compression: LZ4
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:             Options.num_levels: 7
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:           Options.merge_operator: None
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:        Options.compaction_filter: None
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55e0b44e1b80)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55e0b36fc9b0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 536870912#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.compression: LZ4
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:             Options.num_levels: 7
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:           Options.merge_operator: None
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:        Options.compaction_filter: None
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55e0b44e1b80)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55e0b36fc9b0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 536870912#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.compression: LZ4
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:             Options.num_levels: 7
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:           Options.merge_operator: None
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:        Options.compaction_filter: None
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55e0b44e1b80)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55e0b36fc9b0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 536870912#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.compression: LZ4
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:             Options.num_levels: 7
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: [db/column_family.cc:635]    (skipping printing options)
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: [db/column_family.cc:635]    (skipping printing options)
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 1cb84ae6-32e7-4024-868e-161bc4aff208
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582506335052, "job": 1, "event": "recovery_started", "wal_files": [31]}
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582506335316, "job": 1, "event": "recovery_finished"}
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _open_super_meta old nid_max 1025
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _open_super_meta old blobid_max 10240
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _open_super_meta ondisk_format 4 compat_ondisk_format 3
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _open_super_meta min_alloc_size 0x1000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: freelist init
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: freelist _read_cfg
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _init_alloc loaded 20 GiB in 2 extents, allocator type hybrid, capacity 0x4ffc00000, block size 0x1000, free 0x4ffbfd000, fragmentation 1.9e-07
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: [db/db_impl/db_impl.cc:496] Shutdown: canceling all background work
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: [db/db_impl/db_impl.cc:704] Shutdown complete
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: bluefs umount
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: bdev(0x55e0b450d000 /var/lib/ceph/osd/ceph-0/block) close
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: bdev(0x55e0b450d000 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: bdev(0x55e0b450d000 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: bdev(0x55e0b450d000 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: bdev(0x55e0b450d000 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-0/block size 20 GiB
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: bluefs mount
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: bluefs _init_alloc shared, id 1, capacity 0x4ffc00000, block size 0x10000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: bluefs mount final locked allocations 0 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: bluefs mount final locked allocations 1 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: bluefs mount final locked allocations 2 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: bluefs mount final locked allocations 3 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: bluefs mount final locked allocations 4 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: bluefs mount shared_bdev_used = 4718592
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _prepare_db_environment set db_paths to db,20397110067 db.slow,20397110067
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: RocksDB version: 7.9.2
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Git sha 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Compile date 2025-07-17 03:12:14
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: DB SUMMARY
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: DB Session ID:  H3UVPN3L425TCTEDJE2Y
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: CURRENT file:  CURRENT
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: IDENTITY file:  IDENTITY
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5097 ; 
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                         Options.error_if_exists: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                       Options.create_if_missing: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                         Options.paranoid_checks: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:             Options.flush_verify_memtable_count: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                                     Options.env: 0x55e0b467a310
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                                      Options.fs: LegacyFileSystem
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                                Options.info_log: 0x55e0b44e1940
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.max_file_opening_threads: 16
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                              Options.statistics: (nil)
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                               Options.use_fsync: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                       Options.max_log_file_size: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                   Options.log_file_time_to_roll: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                       Options.keep_log_file_num: 1000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                    Options.recycle_log_file_num: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                         Options.allow_fallocate: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                        Options.allow_mmap_reads: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                       Options.allow_mmap_writes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                        Options.use_direct_reads: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.create_missing_column_families: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                              Options.db_log_dir: 
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                                 Options.wal_dir: db.wal
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.table_cache_numshardbits: 6
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                         Options.WAL_ttl_seconds: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                       Options.WAL_size_limit_MB: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:             Options.manifest_preallocation_size: 4194304
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                     Options.is_fd_close_on_exec: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                   Options.advise_random_on_open: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                    Options.db_write_buffer_size: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                    Options.write_buffer_manager: 0x55e0b45d6a00
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.access_hint_on_compaction_start: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                      Options.use_adaptive_mutex: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                            Options.rate_limiter: (nil)
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                       Options.wal_recovery_mode: 2
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.enable_thread_tracking: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.enable_pipelined_write: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.unordered_write: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:             Options.write_thread_max_yield_usec: 100
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                               Options.row_cache: None
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                              Options.wal_filter: None
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:             Options.avoid_flush_during_recovery: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:             Options.allow_ingest_behind: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:             Options.two_write_queues: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:             Options.manual_wal_flush: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:             Options.wal_compression: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:             Options.atomic_flush: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                 Options.persist_stats_to_disk: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                 Options.write_dbid_to_manifest: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                 Options.log_readahead_size: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                 Options.best_efforts_recovery: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:             Options.allow_data_in_errors: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:             Options.db_host_id: __hostname__
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:             Options.enforce_single_del_contracts: true
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:             Options.max_background_jobs: 4
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:             Options.max_background_compactions: -1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:             Options.max_subcompactions: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:           Options.writable_file_max_buffer_size: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:             Options.delayed_write_rate : 16777216
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:             Options.max_total_wal_size: 1073741824
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                   Options.stats_dump_period_sec: 600
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                 Options.stats_persist_period_sec: 600
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                          Options.max_open_files: -1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                          Options.bytes_per_sync: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                      Options.wal_bytes_per_sync: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                   Options.strict_bytes_per_sync: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:       Options.compaction_readahead_size: 2097152
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.max_background_flushes: -1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Compression algorithms supported:
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:    kZSTD supported: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:    kXpressCompression supported: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:    kBZip2Compression supported: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:    kZSTDNotFinalCompression supported: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:    kLZ4Compression supported: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:    kZlibCompression supported: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:    kLZ4HCCompression supported: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:    kSnappyCompression supported: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Fast CRC32 supported: Supported on x86
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: DMutex implementation: pthread_mutex_t
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:        Options.compaction_filter: None
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55e0b44e1680)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55e0b36fd350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.compression: LZ4
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:             Options.num_levels: 7
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:           Options.merge_operator: None
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:        Options.compaction_filter: None
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55e0b44e1680)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55e0b36fd350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.compression: LZ4
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:             Options.num_levels: 7
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:           Options.merge_operator: None
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:        Options.compaction_filter: None
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55e0b44e1680)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55e0b36fd350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.compression: LZ4
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:             Options.num_levels: 7
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:           Options.merge_operator: None
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:        Options.compaction_filter: None
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55e0b44e1680)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55e0b36fd350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.compression: LZ4
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:             Options.num_levels: 7
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:           Options.merge_operator: None
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:        Options.compaction_filter: None
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55e0b44e1680)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55e0b36fd350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.compression: LZ4
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:             Options.num_levels: 7
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:           Options.merge_operator: None
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:        Options.compaction_filter: None
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55e0b44e1680)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55e0b36fd350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.compression: LZ4
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:             Options.num_levels: 7
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:           Options.merge_operator: None
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:        Options.compaction_filter: None
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55e0b44e1680)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55e0b36fd350#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.compression: LZ4
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:             Options.num_levels: 7
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:           Options.merge_operator: None
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:        Options.compaction_filter: None
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55e0b44e1ac0)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55e0b36fc9b0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 536870912#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.compression: LZ4
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:             Options.num_levels: 7
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:           Options.merge_operator: None
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:        Options.compaction_filter: None
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55e0b44e1ac0)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55e0b36fc9b0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 536870912#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.compression: LZ4
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:             Options.num_levels: 7
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:           Options.merge_operator: None
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:        Options.compaction_filter: None
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55e0b44e1ac0)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55e0b36fc9b0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 536870912#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.compression: LZ4
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:             Options.num_levels: 7
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: [db/column_family.cc:635] #011(skipping printing options)
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: [db/column_family.cc:635] #011(skipping printing options)
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 1cb84ae6-32e7-4024-868e-161bc4aff208
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582506579328, "job": 1, "event": "recovery_started", "wal_files": [31]}
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582506584398, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 35, "file_size": 1272, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13, "largest_seqno": 21, "table_properties": {"data_size": 128, "index_size": 27, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 87, "raw_average_key_size": 17, "raw_value_size": 82, "raw_average_value_size": 16, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 2, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": ".T:int64_array.b:bitwise_xor", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582506, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cb84ae6-32e7-4024-868e-161bc4aff208", "db_session_id": "H3UVPN3L425TCTEDJE2Y", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582506587531, "cf_name": "p-0", "job": 1, "event": "table_file_creation", "file_number": 36, "file_size": 1595, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14, "largest_seqno": 15, "table_properties": {"data_size": 469, "index_size": 39, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 72, "raw_average_key_size": 36, "raw_value_size": 571, "raw_average_value_size": 285, "num_data_blocks": 1, "num_entries": 2, "num_filter_entries": 2, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "p-0", "column_family_id": 4, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582506, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cb84ae6-32e7-4024-868e-161bc4aff208", "db_session_id": "H3UVPN3L425TCTEDJE2Y", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582506590384, "cf_name": "O-2", "job": 1, "event": "table_file_creation", "file_number": 37, "file_size": 1275, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16, "largest_seqno": 16, "table_properties": {"data_size": 121, "index_size": 64, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 55, "raw_average_key_size": 55, "raw_value_size": 50, "raw_average_value_size": 50, "num_data_blocks": 1, "num_entries": 1, "num_filter_entries": 1, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "O-2", "column_family_id": 9, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582506, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1cb84ae6-32e7-4024-868e-161bc4aff208", "db_session_id": "H3UVPN3L425TCTEDJE2Y", "orig_file_number": 37, "seqno_to_time_mapping": "N/A"}}
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582506591665, "job": 1, "event": "recovery_finished"}
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: [db/version_set.cc:5047] Creating manifest 40
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x55e0b46a8000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: DB pointer 0x55e0b4688000
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _upgrade_super from 4, latest 4
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _upgrade_super done
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 0.0 total, 0.0 interval#012Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s#012Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0#012 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.0 total, 0.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.03 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.03 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55e0b36fd350#2 capacity: 460.80 MB usage: 0.94 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(1,0.27 KB,5.62933e-05%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-0] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.0 total, 0.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55e0b36fd350#2 capacity: 460.80 MB usage: 0.94 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(1,0.27 KB,5.62933e-05%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-1] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.0 total, 0.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Bloc
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: <cls> /home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos9/DIST/centos9/MACHINE_SIZE/gigantic/release/19.2.3/rpm/el9/BUILD/ceph-19.2.3/src/cls/cephfs/cls_cephfs.cc:201: loading cephfs
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: <cls> /home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos9/DIST/centos9/MACHINE_SIZE/gigantic/release/19.2.3/rpm/el9/BUILD/ceph-19.2.3/src/cls/hello/cls_hello.cc:316: loading cls_hello
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: _get_class not permitted to load lua
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: _get_class not permitted to load sdk
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: osd.0 0 crush map has features 288232575208783872, adjusting msgr requires for clients
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: osd.0 0 crush map has features 288232575208783872 was 8705, adjusting msgr requires for mons
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: osd.0 0 crush map has features 288232575208783872, adjusting msgr requires for osds
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: osd.0 0 check_osdmap_features enabling on-disk ERASURE CODES compat feature
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: osd.0 0 load_pgs
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: osd.0 0 load_pgs opened 0 pgs
Dec  1 04:48:26 np0005540826 ceph-osd[77525]: osd.0 0 log_to_monitors true
Dec  1 04:48:26 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-0[77521]: 2025-12-01T09:48:26.613+0000 7f24844d3740 -1 osd.0 0 log_to_monitors true
Dec  1 04:48:27 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : purged_snaps scrub starts
Dec  1 04:48:27 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : purged_snaps scrub ok
Dec  1 04:48:27 np0005540826 podman[78355]: 2025-12-01 09:48:27.817583232 +0000 UTC m=+0.055594493 container exec 7c618b1a07db29948a07bf15abaac0e58a16d635b553771e228aad7ce3c0be89 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-crash-compute-1, CEPH_REF=squid, org.label-schema.build-date=20250325, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:48:27 np0005540826 podman[78355]: 2025-12-01 09:48:27.918393085 +0000 UTC m=+0.156404346 container exec_died 7c618b1a07db29948a07bf15abaac0e58a16d635b553771e228aad7ce3c0be89 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-crash-compute-1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default)
Dec  1 04:48:28 np0005540826 podman[78497]: 2025-12-01 09:48:28.510127196 +0000 UTC m=+0.039033203 container create b85c60566167e7fed3385656f96773c6e21fccc09c0647b2c60594957adbf3f4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=intelligent_joliot, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_REF=squid, ceph=True)
Dec  1 04:48:28 np0005540826 systemd[1]: Started libpod-conmon-b85c60566167e7fed3385656f96773c6e21fccc09c0647b2c60594957adbf3f4.scope.
Dec  1 04:48:28 np0005540826 systemd[1]: Started libcrun container.
Dec  1 04:48:28 np0005540826 podman[78497]: 2025-12-01 09:48:28.581287997 +0000 UTC m=+0.110194024 container init b85c60566167e7fed3385656f96773c6e21fccc09c0647b2c60594957adbf3f4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=intelligent_joliot, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Dec  1 04:48:28 np0005540826 ceph-osd[77525]: osd.0 0 done with init, starting boot process
Dec  1 04:48:28 np0005540826 ceph-osd[77525]: osd.0 0 start_boot
Dec  1 04:48:28 np0005540826 ceph-osd[77525]: osd.0 0 maybe_override_options_for_qos osd_max_backfills set to 1
Dec  1 04:48:28 np0005540826 ceph-osd[77525]: osd.0 0 maybe_override_options_for_qos osd_recovery_max_active set to 0
Dec  1 04:48:28 np0005540826 ceph-osd[77525]: osd.0 0 maybe_override_options_for_qos osd_recovery_max_active_hdd set to 3
Dec  1 04:48:28 np0005540826 ceph-osd[77525]: osd.0 0 maybe_override_options_for_qos osd_recovery_max_active_ssd set to 10
Dec  1 04:48:28 np0005540826 ceph-osd[77525]: osd.0 0  bench count 12288000 bsize 4 KiB
Dec  1 04:48:28 np0005540826 podman[78497]: 2025-12-01 09:48:28.491963115 +0000 UTC m=+0.020869122 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 04:48:28 np0005540826 podman[78497]: 2025-12-01 09:48:28.590037496 +0000 UTC m=+0.118943493 container start b85c60566167e7fed3385656f96773c6e21fccc09c0647b2c60594957adbf3f4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=intelligent_joliot, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.build-date=20250325)
Dec  1 04:48:28 np0005540826 podman[78497]: 2025-12-01 09:48:28.594016446 +0000 UTC m=+0.122922453 container attach b85c60566167e7fed3385656f96773c6e21fccc09c0647b2c60594957adbf3f4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=intelligent_joliot, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid)
Dec  1 04:48:28 np0005540826 intelligent_joliot[78513]: 167 167
Dec  1 04:48:28 np0005540826 systemd[1]: libpod-b85c60566167e7fed3385656f96773c6e21fccc09c0647b2c60594957adbf3f4.scope: Deactivated successfully.
Dec  1 04:48:28 np0005540826 podman[78497]: 2025-12-01 09:48:28.596707285 +0000 UTC m=+0.125613272 container died b85c60566167e7fed3385656f96773c6e21fccc09c0647b2c60594957adbf3f4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=intelligent_joliot, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, OSD_FLAVOR=default, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:48:28 np0005540826 systemd[1]: var-lib-containers-storage-overlay-d33af2013220c629096bb80f352a63cc8db3299ecea19c369594d213d62bf43e-merged.mount: Deactivated successfully.
Dec  1 04:48:28 np0005540826 podman[78497]: 2025-12-01 09:48:28.677853512 +0000 UTC m=+0.206759539 container remove b85c60566167e7fed3385656f96773c6e21fccc09c0647b2c60594957adbf3f4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=intelligent_joliot, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, OSD_FLAVOR=default)
Dec  1 04:48:28 np0005540826 systemd[1]: libpod-conmon-b85c60566167e7fed3385656f96773c6e21fccc09c0647b2c60594957adbf3f4.scope: Deactivated successfully.
Dec  1 04:48:28 np0005540826 podman[78536]: 2025-12-01 09:48:28.86642963 +0000 UTC m=+0.071434811 container create 71463b514302714686018d1ebea53d635ee321c5540b952768f3006b05b9eda0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=pedantic_wilbur, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, CEPH_REF=squid, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, ceph=True, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS)
Dec  1 04:48:28 np0005540826 podman[78536]: 2025-12-01 09:48:28.822897857 +0000 UTC m=+0.027903108 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 04:48:28 np0005540826 systemd[1]: Started libpod-conmon-71463b514302714686018d1ebea53d635ee321c5540b952768f3006b05b9eda0.scope.
Dec  1 04:48:28 np0005540826 systemd[1]: Started libcrun container.
Dec  1 04:48:28 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/adab76a970a15815510e35f1d9ef24de3be76e3ddeb9e8432045906de1b0c2d5/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:48:28 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/adab76a970a15815510e35f1d9ef24de3be76e3ddeb9e8432045906de1b0c2d5/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:48:28 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/adab76a970a15815510e35f1d9ef24de3be76e3ddeb9e8432045906de1b0c2d5/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:48:28 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/adab76a970a15815510e35f1d9ef24de3be76e3ddeb9e8432045906de1b0c2d5/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:48:29 np0005540826 podman[78536]: 2025-12-01 09:48:29.002825397 +0000 UTC m=+0.207830558 container init 71463b514302714686018d1ebea53d635ee321c5540b952768f3006b05b9eda0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=pedantic_wilbur, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:48:29 np0005540826 podman[78536]: 2025-12-01 09:48:29.014719668 +0000 UTC m=+0.219724829 container start 71463b514302714686018d1ebea53d635ee321c5540b952768f3006b05b9eda0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=pedantic_wilbur, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:48:29 np0005540826 podman[78536]: 2025-12-01 09:48:29.035869284 +0000 UTC m=+0.240874455 container attach 71463b514302714686018d1ebea53d635ee321c5540b952768f3006b05b9eda0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=pedantic_wilbur, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec  1 04:48:29 np0005540826 pedantic_wilbur[78552]: [
Dec  1 04:48:29 np0005540826 pedantic_wilbur[78552]:    {
Dec  1 04:48:29 np0005540826 pedantic_wilbur[78552]:        "available": false,
Dec  1 04:48:29 np0005540826 pedantic_wilbur[78552]:        "being_replaced": false,
Dec  1 04:48:29 np0005540826 pedantic_wilbur[78552]:        "ceph_device_lvm": false,
Dec  1 04:48:29 np0005540826 pedantic_wilbur[78552]:        "device_id": "QEMU_DVD-ROM_QM00001",
Dec  1 04:48:29 np0005540826 pedantic_wilbur[78552]:        "lsm_data": {},
Dec  1 04:48:29 np0005540826 pedantic_wilbur[78552]:        "lvs": [],
Dec  1 04:48:29 np0005540826 pedantic_wilbur[78552]:        "path": "/dev/sr0",
Dec  1 04:48:29 np0005540826 pedantic_wilbur[78552]:        "rejected_reasons": [
Dec  1 04:48:29 np0005540826 pedantic_wilbur[78552]:            "Has a FileSystem",
Dec  1 04:48:29 np0005540826 pedantic_wilbur[78552]:            "Insufficient space (<5GB)"
Dec  1 04:48:29 np0005540826 pedantic_wilbur[78552]:        ],
Dec  1 04:48:29 np0005540826 pedantic_wilbur[78552]:        "sys_api": {
Dec  1 04:48:29 np0005540826 pedantic_wilbur[78552]:            "actuators": null,
Dec  1 04:48:29 np0005540826 pedantic_wilbur[78552]:            "device_nodes": [
Dec  1 04:48:29 np0005540826 pedantic_wilbur[78552]:                "sr0"
Dec  1 04:48:29 np0005540826 pedantic_wilbur[78552]:            ],
Dec  1 04:48:29 np0005540826 pedantic_wilbur[78552]:            "devname": "sr0",
Dec  1 04:48:29 np0005540826 pedantic_wilbur[78552]:            "human_readable_size": "482.00 KB",
Dec  1 04:48:29 np0005540826 pedantic_wilbur[78552]:            "id_bus": "ata",
Dec  1 04:48:29 np0005540826 pedantic_wilbur[78552]:            "model": "QEMU DVD-ROM",
Dec  1 04:48:29 np0005540826 pedantic_wilbur[78552]:            "nr_requests": "2",
Dec  1 04:48:29 np0005540826 pedantic_wilbur[78552]:            "parent": "/dev/sr0",
Dec  1 04:48:29 np0005540826 pedantic_wilbur[78552]:            "partitions": {},
Dec  1 04:48:29 np0005540826 pedantic_wilbur[78552]:            "path": "/dev/sr0",
Dec  1 04:48:29 np0005540826 pedantic_wilbur[78552]:            "removable": "1",
Dec  1 04:48:29 np0005540826 pedantic_wilbur[78552]:            "rev": "2.5+",
Dec  1 04:48:29 np0005540826 pedantic_wilbur[78552]:            "ro": "0",
Dec  1 04:48:29 np0005540826 pedantic_wilbur[78552]:            "rotational": "1",
Dec  1 04:48:29 np0005540826 pedantic_wilbur[78552]:            "sas_address": "",
Dec  1 04:48:29 np0005540826 pedantic_wilbur[78552]:            "sas_device_handle": "",
Dec  1 04:48:29 np0005540826 pedantic_wilbur[78552]:            "scheduler_mode": "mq-deadline",
Dec  1 04:48:29 np0005540826 pedantic_wilbur[78552]:            "sectors": 0,
Dec  1 04:48:29 np0005540826 pedantic_wilbur[78552]:            "sectorsize": "2048",
Dec  1 04:48:29 np0005540826 pedantic_wilbur[78552]:            "size": 493568.0,
Dec  1 04:48:29 np0005540826 pedantic_wilbur[78552]:            "support_discard": "2048",
Dec  1 04:48:29 np0005540826 pedantic_wilbur[78552]:            "type": "disk",
Dec  1 04:48:29 np0005540826 pedantic_wilbur[78552]:            "vendor": "QEMU"
Dec  1 04:48:29 np0005540826 pedantic_wilbur[78552]:        }
Dec  1 04:48:29 np0005540826 pedantic_wilbur[78552]:    }
Dec  1 04:48:29 np0005540826 pedantic_wilbur[78552]: ]
Dec  1 04:48:29 np0005540826 systemd[1]: libpod-71463b514302714686018d1ebea53d635ee321c5540b952768f3006b05b9eda0.scope: Deactivated successfully.
Dec  1 04:48:29 np0005540826 podman[78536]: 2025-12-01 09:48:29.751229103 +0000 UTC m=+0.956234284 container died 71463b514302714686018d1ebea53d635ee321c5540b952768f3006b05b9eda0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=pedantic_wilbur, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, ceph=True, CEPH_REF=squid, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:48:29 np0005540826 systemd[1]: var-lib-containers-storage-overlay-adab76a970a15815510e35f1d9ef24de3be76e3ddeb9e8432045906de1b0c2d5-merged.mount: Deactivated successfully.
Dec  1 04:48:30 np0005540826 podman[78536]: 2025-12-01 09:48:30.048198499 +0000 UTC m=+1.253203660 container remove 71463b514302714686018d1ebea53d635ee321c5540b952768f3006b05b9eda0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=pedantic_wilbur, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2)
Dec  1 04:48:30 np0005540826 systemd[1]: libpod-conmon-71463b514302714686018d1ebea53d635ee321c5540b952768f3006b05b9eda0.scope: Deactivated successfully.
Dec  1 04:48:33 np0005540826 ceph-osd[77525]: osd.0 0 maybe_override_max_osd_capacity_for_qos osd bench result - bandwidth (MiB/sec): 27.252 iops: 6976.417 elapsed_sec: 0.430
Dec  1 04:48:33 np0005540826 ceph-osd[77525]: log_channel(cluster) log [WRN] : OSD bench result of 6976.416651 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.0. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Dec  1 04:48:33 np0005540826 ceph-osd[77525]: osd.0 0 waiting for initial osdmap
Dec  1 04:48:33 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-0[77521]: 2025-12-01T09:48:33.107+0000 7f2480c69640 -1 osd.0 0 waiting for initial osdmap
Dec  1 04:48:33 np0005540826 ceph-osd[77525]: osd.0 8 crush map has features 288514050185494528, adjusting msgr requires for clients
Dec  1 04:48:33 np0005540826 ceph-osd[77525]: osd.0 8 crush map has features 288514050185494528 was 288232575208792577, adjusting msgr requires for mons
Dec  1 04:48:33 np0005540826 ceph-osd[77525]: osd.0 8 crush map has features 3314932999778484224, adjusting msgr requires for osds
Dec  1 04:48:33 np0005540826 ceph-osd[77525]: osd.0 8 check_osdmap_features require_osd_release unknown -> squid
Dec  1 04:48:33 np0005540826 ceph-osd[77525]: osd.0 8 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Dec  1 04:48:33 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-0[77521]: 2025-12-01T09:48:33.140+0000 7f247ba7e640 -1 osd.0 8 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Dec  1 04:48:33 np0005540826 ceph-osd[77525]: osd.0 8 set_numa_affinity not setting numa affinity
Dec  1 04:48:33 np0005540826 ceph-osd[77525]: osd.0 8 _collect_metadata loop3:  no unique device id for loop3: fallback method has no model nor serial no unique device path for loop3: no symlink to loop3 in /dev/disk/by-path
Dec  1 04:48:33 np0005540826 ceph-osd[77525]: osd.0 9 state: booting -> active
Dec  1 04:48:37 np0005540826 ceph-osd[77525]: osd.0 12 crush map has features 288514051259236352, adjusting msgr requires for clients
Dec  1 04:48:37 np0005540826 ceph-osd[77525]: osd.0 12 crush map has features 288514051259236352 was 288514050185503233, adjusting msgr requires for mons
Dec  1 04:48:37 np0005540826 ceph-osd[77525]: osd.0 12 crush map has features 3314933000852226048, adjusting msgr requires for osds
Dec  1 04:48:58 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 16 pg[3.0( empty local-lis/les=0/0 n=0 ec=16/16 lis/c=0/0 les/c/f=0/0/0 sis=16) [0] r=0 lpr=16 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:48:59 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 17 pg[3.0( empty local-lis/les=16/17 n=0 ec=16/16 lis/c=0/0 les/c/f=0/0/0 sis=16) [0] r=0 lpr=16 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:00 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 18 pg[4.0( empty local-lis/les=0/0 n=0 ec=18/18 lis/c=0/0 les/c/f=0/0/0 sis=18) [0] r=0 lpr=18 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:01 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 19 pg[5.0( empty local-lis/les=0/0 n=0 ec=19/19 lis/c=0/0 les/c/f=0/0/0 sis=19) [0] r=0 lpr=19 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:01 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 19 pg[4.0( empty local-lis/les=18/19 n=0 ec=18/18 lis/c=0/0 les/c/f=0/0/0 sis=18) [0] r=0 lpr=18 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:02 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 20 pg[5.0( empty local-lis/les=19/20 n=0 ec=19/19 lis/c=0/0 les/c/f=0/0/0 sis=19) [0] r=0 lpr=19 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:03 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 21 pg[6.0( empty local-lis/les=0/0 n=0 ec=21/21 lis/c=0/0 les/c/f=0/0/0 sis=21) [0] r=0 lpr=21 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:05 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 22 pg[6.0( empty local-lis/les=21/22 n=0 ec=21/21 lis/c=0/0 les/c/f=0/0/0 sis=21) [0] r=0 lpr=21 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:14 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 29 pg[3.0( empty local-lis/les=16/17 n=0 ec=16/16 lis/c=16/16 les/c/f=17/17/0 sis=29 pruub=8.961617470s) [0] r=0 lpr=29 pi=[16,29)/1 crt=0'0 mlcod 0'0 active pruub 56.979534149s@ mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [0], acting_primary 0 -> 0, up_primary 0 -> 0, role 0 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:49:14 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 29 pg[3.0( empty local-lis/les=16/17 n=0 ec=16/16 lis/c=16/16 les/c/f=17/17/0 sis=29 pruub=8.961617470s) [0] r=0 lpr=29 pi=[16,29)/1 crt=0'0 mlcod 0'0 unknown pruub 56.979534149s@ mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:15 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 30 pg[3.18( empty local-lis/les=16/17 n=0 ec=29/16 lis/c=16/16 les/c/f=17/17/0 sis=29) [0] r=0 lpr=29 pi=[16,29)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:15 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 30 pg[3.19( empty local-lis/les=16/17 n=0 ec=29/16 lis/c=16/16 les/c/f=17/17/0 sis=29) [0] r=0 lpr=29 pi=[16,29)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:15 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 30 pg[3.17( empty local-lis/les=16/17 n=0 ec=29/16 lis/c=16/16 les/c/f=17/17/0 sis=29) [0] r=0 lpr=29 pi=[16,29)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:15 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 30 pg[3.16( empty local-lis/les=16/17 n=0 ec=29/16 lis/c=16/16 les/c/f=17/17/0 sis=29) [0] r=0 lpr=29 pi=[16,29)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:15 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 30 pg[3.14( empty local-lis/les=16/17 n=0 ec=29/16 lis/c=16/16 les/c/f=17/17/0 sis=29) [0] r=0 lpr=29 pi=[16,29)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:15 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 30 pg[3.13( empty local-lis/les=16/17 n=0 ec=29/16 lis/c=16/16 les/c/f=17/17/0 sis=29) [0] r=0 lpr=29 pi=[16,29)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:15 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 30 pg[3.12( empty local-lis/les=16/17 n=0 ec=29/16 lis/c=16/16 les/c/f=17/17/0 sis=29) [0] r=0 lpr=29 pi=[16,29)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:15 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 30 pg[3.11( empty local-lis/les=16/17 n=0 ec=29/16 lis/c=16/16 les/c/f=17/17/0 sis=29) [0] r=0 lpr=29 pi=[16,29)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:15 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 30 pg[3.10( empty local-lis/les=16/17 n=0 ec=29/16 lis/c=16/16 les/c/f=17/17/0 sis=29) [0] r=0 lpr=29 pi=[16,29)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:15 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 30 pg[3.f( empty local-lis/les=16/17 n=0 ec=29/16 lis/c=16/16 les/c/f=17/17/0 sis=29) [0] r=0 lpr=29 pi=[16,29)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:15 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 30 pg[3.c( empty local-lis/les=16/17 n=0 ec=29/16 lis/c=16/16 les/c/f=17/17/0 sis=29) [0] r=0 lpr=29 pi=[16,29)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:15 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 30 pg[3.d( empty local-lis/les=16/17 n=0 ec=29/16 lis/c=16/16 les/c/f=17/17/0 sis=29) [0] r=0 lpr=29 pi=[16,29)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:15 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 30 pg[3.e( empty local-lis/les=16/17 n=0 ec=29/16 lis/c=16/16 les/c/f=17/17/0 sis=29) [0] r=0 lpr=29 pi=[16,29)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:15 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 30 pg[3.b( empty local-lis/les=16/17 n=0 ec=29/16 lis/c=16/16 les/c/f=17/17/0 sis=29) [0] r=0 lpr=29 pi=[16,29)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:15 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 30 pg[3.a( empty local-lis/les=16/17 n=0 ec=29/16 lis/c=16/16 les/c/f=17/17/0 sis=29) [0] r=0 lpr=29 pi=[16,29)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:15 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 30 pg[3.7( empty local-lis/les=16/17 n=0 ec=29/16 lis/c=16/16 les/c/f=17/17/0 sis=29) [0] r=0 lpr=29 pi=[16,29)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:15 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 30 pg[4.0( empty local-lis/les=18/19 n=0 ec=18/18 lis/c=18/18 les/c/f=19/19/0 sis=30 pruub=10.343406677s) [0] r=0 lpr=30 pi=[18,30)/1 crt=0'0 mlcod 0'0 active pruub 59.284049988s@ mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [0], acting_primary 0 -> 0, up_primary 0 -> 0, role 0 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:49:15 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 30 pg[3.5( empty local-lis/les=16/17 n=0 ec=29/16 lis/c=16/16 les/c/f=17/17/0 sis=29) [0] r=0 lpr=29 pi=[16,29)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:15 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 30 pg[3.6( empty local-lis/les=16/17 n=0 ec=29/16 lis/c=16/16 les/c/f=17/17/0 sis=29) [0] r=0 lpr=29 pi=[16,29)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:15 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 30 pg[3.1( empty local-lis/les=16/17 n=0 ec=29/16 lis/c=16/16 les/c/f=17/17/0 sis=29) [0] r=0 lpr=29 pi=[16,29)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:15 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 30 pg[3.2( empty local-lis/les=16/17 n=0 ec=29/16 lis/c=16/16 les/c/f=17/17/0 sis=29) [0] r=0 lpr=29 pi=[16,29)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:15 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 30 pg[3.3( empty local-lis/les=16/17 n=0 ec=29/16 lis/c=16/16 les/c/f=17/17/0 sis=29) [0] r=0 lpr=29 pi=[16,29)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:15 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 30 pg[3.4( empty local-lis/les=16/17 n=0 ec=29/16 lis/c=16/16 les/c/f=17/17/0 sis=29) [0] r=0 lpr=29 pi=[16,29)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:15 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 30 pg[3.9( empty local-lis/les=16/17 n=0 ec=29/16 lis/c=16/16 les/c/f=17/17/0 sis=29) [0] r=0 lpr=29 pi=[16,29)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:15 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 30 pg[3.8( empty local-lis/les=16/17 n=0 ec=29/16 lis/c=16/16 les/c/f=17/17/0 sis=29) [0] r=0 lpr=29 pi=[16,29)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:15 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 30 pg[3.1a( empty local-lis/les=16/17 n=0 ec=29/16 lis/c=16/16 les/c/f=17/17/0 sis=29) [0] r=0 lpr=29 pi=[16,29)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:15 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 30 pg[3.1b( empty local-lis/les=16/17 n=0 ec=29/16 lis/c=16/16 les/c/f=17/17/0 sis=29) [0] r=0 lpr=29 pi=[16,29)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:15 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 30 pg[3.1c( empty local-lis/les=16/17 n=0 ec=29/16 lis/c=16/16 les/c/f=17/17/0 sis=29) [0] r=0 lpr=29 pi=[16,29)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:15 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 30 pg[3.1d( empty local-lis/les=16/17 n=0 ec=29/16 lis/c=16/16 les/c/f=17/17/0 sis=29) [0] r=0 lpr=29 pi=[16,29)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:15 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 30 pg[3.1e( empty local-lis/les=16/17 n=0 ec=29/16 lis/c=16/16 les/c/f=17/17/0 sis=29) [0] r=0 lpr=29 pi=[16,29)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:15 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 30 pg[3.1f( empty local-lis/les=16/17 n=0 ec=29/16 lis/c=16/16 les/c/f=17/17/0 sis=29) [0] r=0 lpr=29 pi=[16,29)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:15 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 30 pg[3.15( empty local-lis/les=16/17 n=0 ec=29/16 lis/c=16/16 les/c/f=17/17/0 sis=29) [0] r=0 lpr=29 pi=[16,29)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:15 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 30 pg[4.0( empty local-lis/les=18/19 n=0 ec=18/18 lis/c=18/18 les/c/f=19/19/0 sis=30 pruub=10.343406677s) [0] r=0 lpr=30 pi=[18,30)/1 crt=0'0 mlcod 0'0 unknown pruub 59.284049988s@ mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:15 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 30 pg[3.19( empty local-lis/les=29/30 n=0 ec=29/16 lis/c=16/16 les/c/f=17/17/0 sis=29) [0] r=0 lpr=29 pi=[16,29)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:15 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 30 pg[3.14( empty local-lis/les=29/30 n=0 ec=29/16 lis/c=16/16 les/c/f=17/17/0 sis=29) [0] r=0 lpr=29 pi=[16,29)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:15 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 30 pg[3.13( empty local-lis/les=29/30 n=0 ec=29/16 lis/c=16/16 les/c/f=17/17/0 sis=29) [0] r=0 lpr=29 pi=[16,29)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:15 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 30 pg[3.17( empty local-lis/les=29/30 n=0 ec=29/16 lis/c=16/16 les/c/f=17/17/0 sis=29) [0] r=0 lpr=29 pi=[16,29)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:15 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 30 pg[3.16( empty local-lis/les=29/30 n=0 ec=29/16 lis/c=16/16 les/c/f=17/17/0 sis=29) [0] r=0 lpr=29 pi=[16,29)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:15 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 30 pg[3.11( empty local-lis/les=29/30 n=0 ec=29/16 lis/c=16/16 les/c/f=17/17/0 sis=29) [0] r=0 lpr=29 pi=[16,29)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:15 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 30 pg[3.f( empty local-lis/les=29/30 n=0 ec=29/16 lis/c=16/16 les/c/f=17/17/0 sis=29) [0] r=0 lpr=29 pi=[16,29)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:15 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 30 pg[3.10( empty local-lis/les=29/30 n=0 ec=29/16 lis/c=16/16 les/c/f=17/17/0 sis=29) [0] r=0 lpr=29 pi=[16,29)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:15 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 30 pg[3.12( empty local-lis/les=29/30 n=0 ec=29/16 lis/c=16/16 les/c/f=17/17/0 sis=29) [0] r=0 lpr=29 pi=[16,29)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:15 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 30 pg[3.c( empty local-lis/les=29/30 n=0 ec=29/16 lis/c=16/16 les/c/f=17/17/0 sis=29) [0] r=0 lpr=29 pi=[16,29)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:15 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 30 pg[3.d( empty local-lis/les=29/30 n=0 ec=29/16 lis/c=16/16 les/c/f=17/17/0 sis=29) [0] r=0 lpr=29 pi=[16,29)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:15 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 30 pg[3.18( empty local-lis/les=29/30 n=0 ec=29/16 lis/c=16/16 les/c/f=17/17/0 sis=29) [0] r=0 lpr=29 pi=[16,29)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:15 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 30 pg[3.b( empty local-lis/les=29/30 n=0 ec=29/16 lis/c=16/16 les/c/f=17/17/0 sis=29) [0] r=0 lpr=29 pi=[16,29)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:15 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 30 pg[3.e( empty local-lis/les=29/30 n=0 ec=29/16 lis/c=16/16 les/c/f=17/17/0 sis=29) [0] r=0 lpr=29 pi=[16,29)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:15 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 30 pg[3.0( empty local-lis/les=29/30 n=0 ec=16/16 lis/c=16/16 les/c/f=17/17/0 sis=29) [0] r=0 lpr=29 pi=[16,29)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:15 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 30 pg[3.a( empty local-lis/les=29/30 n=0 ec=29/16 lis/c=16/16 les/c/f=17/17/0 sis=29) [0] r=0 lpr=29 pi=[16,29)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:15 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 30 pg[3.7( empty local-lis/les=29/30 n=0 ec=29/16 lis/c=16/16 les/c/f=17/17/0 sis=29) [0] r=0 lpr=29 pi=[16,29)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:15 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 30 pg[3.6( empty local-lis/les=29/30 n=0 ec=29/16 lis/c=16/16 les/c/f=17/17/0 sis=29) [0] r=0 lpr=29 pi=[16,29)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:15 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 30 pg[3.2( empty local-lis/les=29/30 n=0 ec=29/16 lis/c=16/16 les/c/f=17/17/0 sis=29) [0] r=0 lpr=29 pi=[16,29)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:15 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 30 pg[3.5( empty local-lis/les=29/30 n=0 ec=29/16 lis/c=16/16 les/c/f=17/17/0 sis=29) [0] r=0 lpr=29 pi=[16,29)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:15 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 30 pg[3.1( empty local-lis/les=29/30 n=0 ec=29/16 lis/c=16/16 les/c/f=17/17/0 sis=29) [0] r=0 lpr=29 pi=[16,29)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:15 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 30 pg[3.3( empty local-lis/les=29/30 n=0 ec=29/16 lis/c=16/16 les/c/f=17/17/0 sis=29) [0] r=0 lpr=29 pi=[16,29)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:15 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 30 pg[3.4( empty local-lis/les=29/30 n=0 ec=29/16 lis/c=16/16 les/c/f=17/17/0 sis=29) [0] r=0 lpr=29 pi=[16,29)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:15 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 30 pg[3.9( empty local-lis/les=29/30 n=0 ec=29/16 lis/c=16/16 les/c/f=17/17/0 sis=29) [0] r=0 lpr=29 pi=[16,29)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:15 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 30 pg[3.8( empty local-lis/les=29/30 n=0 ec=29/16 lis/c=16/16 les/c/f=17/17/0 sis=29) [0] r=0 lpr=29 pi=[16,29)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:15 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 30 pg[3.1a( empty local-lis/les=29/30 n=0 ec=29/16 lis/c=16/16 les/c/f=17/17/0 sis=29) [0] r=0 lpr=29 pi=[16,29)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:15 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 30 pg[3.1b( empty local-lis/les=29/30 n=0 ec=29/16 lis/c=16/16 les/c/f=17/17/0 sis=29) [0] r=0 lpr=29 pi=[16,29)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:15 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 30 pg[3.1c( empty local-lis/les=29/30 n=0 ec=29/16 lis/c=16/16 les/c/f=17/17/0 sis=29) [0] r=0 lpr=29 pi=[16,29)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:15 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 30 pg[3.1d( empty local-lis/les=29/30 n=0 ec=29/16 lis/c=16/16 les/c/f=17/17/0 sis=29) [0] r=0 lpr=29 pi=[16,29)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:15 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 30 pg[3.1f( empty local-lis/les=29/30 n=0 ec=29/16 lis/c=16/16 les/c/f=17/17/0 sis=29) [0] r=0 lpr=29 pi=[16,29)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:15 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 30 pg[3.1e( empty local-lis/les=29/30 n=0 ec=29/16 lis/c=16/16 les/c/f=17/17/0 sis=29) [0] r=0 lpr=29 pi=[16,29)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:15 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 30 pg[3.15( empty local-lis/les=29/30 n=0 ec=29/16 lis/c=16/16 les/c/f=17/17/0 sis=29) [0] r=0 lpr=29 pi=[16,29)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:16 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 3.19 scrub starts
Dec  1 04:49:16 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 3.19 scrub ok
Dec  1 04:49:16 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 31 pg[4.1e( empty local-lis/les=18/19 n=0 ec=30/18 lis/c=18/18 les/c/f=19/19/0 sis=30) [0] r=0 lpr=30 pi=[18,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:16 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 31 pg[4.1f( empty local-lis/les=18/19 n=0 ec=30/18 lis/c=18/18 les/c/f=19/19/0 sis=30) [0] r=0 lpr=30 pi=[18,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:16 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 31 pg[4.10( empty local-lis/les=18/19 n=0 ec=30/18 lis/c=18/18 les/c/f=19/19/0 sis=30) [0] r=0 lpr=30 pi=[18,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:16 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 31 pg[4.11( empty local-lis/les=18/19 n=0 ec=30/18 lis/c=18/18 les/c/f=19/19/0 sis=30) [0] r=0 lpr=30 pi=[18,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:16 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 31 pg[4.12( empty local-lis/les=18/19 n=0 ec=30/18 lis/c=18/18 les/c/f=19/19/0 sis=30) [0] r=0 lpr=30 pi=[18,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:16 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 31 pg[4.13( empty local-lis/les=18/19 n=0 ec=30/18 lis/c=18/18 les/c/f=19/19/0 sis=30) [0] r=0 lpr=30 pi=[18,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:16 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 31 pg[4.14( empty local-lis/les=18/19 n=0 ec=30/18 lis/c=18/18 les/c/f=19/19/0 sis=30) [0] r=0 lpr=30 pi=[18,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:16 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 31 pg[4.15( empty local-lis/les=18/19 n=0 ec=30/18 lis/c=18/18 les/c/f=19/19/0 sis=30) [0] r=0 lpr=30 pi=[18,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:16 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 31 pg[4.17( empty local-lis/les=18/19 n=0 ec=30/18 lis/c=18/18 les/c/f=19/19/0 sis=30) [0] r=0 lpr=30 pi=[18,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:16 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 31 pg[4.8( empty local-lis/les=18/19 n=0 ec=30/18 lis/c=18/18 les/c/f=19/19/0 sis=30) [0] r=0 lpr=30 pi=[18,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:16 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 31 pg[4.9( empty local-lis/les=18/19 n=0 ec=30/18 lis/c=18/18 les/c/f=19/19/0 sis=30) [0] r=0 lpr=30 pi=[18,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:16 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 31 pg[4.16( empty local-lis/les=18/19 n=0 ec=30/18 lis/c=18/18 les/c/f=19/19/0 sis=30) [0] r=0 lpr=30 pi=[18,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:16 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 31 pg[4.b( empty local-lis/les=18/19 n=0 ec=30/18 lis/c=18/18 les/c/f=19/19/0 sis=30) [0] r=0 lpr=30 pi=[18,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:16 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 31 pg[4.c( empty local-lis/les=18/19 n=0 ec=30/18 lis/c=18/18 les/c/f=19/19/0 sis=30) [0] r=0 lpr=30 pi=[18,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:16 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 31 pg[4.a( empty local-lis/les=18/19 n=0 ec=30/18 lis/c=18/18 les/c/f=19/19/0 sis=30) [0] r=0 lpr=30 pi=[18,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:16 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 31 pg[4.d( empty local-lis/les=18/19 n=0 ec=30/18 lis/c=18/18 les/c/f=19/19/0 sis=30) [0] r=0 lpr=30 pi=[18,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:16 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 31 pg[4.7( empty local-lis/les=18/19 n=0 ec=30/18 lis/c=18/18 les/c/f=19/19/0 sis=30) [0] r=0 lpr=30 pi=[18,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:16 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 31 pg[4.1( empty local-lis/les=18/19 n=0 ec=30/18 lis/c=18/18 les/c/f=19/19/0 sis=30) [0] r=0 lpr=30 pi=[18,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:16 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 31 pg[4.2( empty local-lis/les=18/19 n=0 ec=30/18 lis/c=18/18 les/c/f=19/19/0 sis=30) [0] r=0 lpr=30 pi=[18,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:16 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 31 pg[4.6( empty local-lis/les=18/19 n=0 ec=30/18 lis/c=18/18 les/c/f=19/19/0 sis=30) [0] r=0 lpr=30 pi=[18,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:16 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 31 pg[4.5( empty local-lis/les=18/19 n=0 ec=30/18 lis/c=18/18 les/c/f=19/19/0 sis=30) [0] r=0 lpr=30 pi=[18,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:16 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 31 pg[4.4( empty local-lis/les=18/19 n=0 ec=30/18 lis/c=18/18 les/c/f=19/19/0 sis=30) [0] r=0 lpr=30 pi=[18,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:16 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 31 pg[4.3( empty local-lis/les=18/19 n=0 ec=30/18 lis/c=18/18 les/c/f=19/19/0 sis=30) [0] r=0 lpr=30 pi=[18,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:16 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 31 pg[4.f( empty local-lis/les=18/19 n=0 ec=30/18 lis/c=18/18 les/c/f=19/19/0 sis=30) [0] r=0 lpr=30 pi=[18,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:16 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 31 pg[4.1d( empty local-lis/les=18/19 n=0 ec=30/18 lis/c=18/18 les/c/f=19/19/0 sis=30) [0] r=0 lpr=30 pi=[18,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:16 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 31 pg[4.1c( empty local-lis/les=18/19 n=0 ec=30/18 lis/c=18/18 les/c/f=19/19/0 sis=30) [0] r=0 lpr=30 pi=[18,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:16 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 31 pg[4.1b( empty local-lis/les=18/19 n=0 ec=30/18 lis/c=18/18 les/c/f=19/19/0 sis=30) [0] r=0 lpr=30 pi=[18,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:16 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 31 pg[4.e( empty local-lis/les=18/19 n=0 ec=30/18 lis/c=18/18 les/c/f=19/19/0 sis=30) [0] r=0 lpr=30 pi=[18,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:16 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 31 pg[4.1a( empty local-lis/les=18/19 n=0 ec=30/18 lis/c=18/18 les/c/f=19/19/0 sis=30) [0] r=0 lpr=30 pi=[18,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:16 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 31 pg[4.18( empty local-lis/les=18/19 n=0 ec=30/18 lis/c=18/18 les/c/f=19/19/0 sis=30) [0] r=0 lpr=30 pi=[18,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:16 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 31 pg[4.19( empty local-lis/les=18/19 n=0 ec=30/18 lis/c=18/18 les/c/f=19/19/0 sis=30) [0] r=0 lpr=30 pi=[18,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:16 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 31 pg[4.1e( empty local-lis/les=30/31 n=0 ec=30/18 lis/c=18/18 les/c/f=19/19/0 sis=30) [0] r=0 lpr=30 pi=[18,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:16 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 31 pg[4.1f( empty local-lis/les=30/31 n=0 ec=30/18 lis/c=18/18 les/c/f=19/19/0 sis=30) [0] r=0 lpr=30 pi=[18,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:16 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 31 pg[4.11( empty local-lis/les=30/31 n=0 ec=30/18 lis/c=18/18 les/c/f=19/19/0 sis=30) [0] r=0 lpr=30 pi=[18,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:16 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 31 pg[4.12( empty local-lis/les=30/31 n=0 ec=30/18 lis/c=18/18 les/c/f=19/19/0 sis=30) [0] r=0 lpr=30 pi=[18,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:16 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 31 pg[4.10( empty local-lis/les=30/31 n=0 ec=30/18 lis/c=18/18 les/c/f=19/19/0 sis=30) [0] r=0 lpr=30 pi=[18,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:16 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 31 pg[4.13( empty local-lis/les=30/31 n=0 ec=30/18 lis/c=18/18 les/c/f=19/19/0 sis=30) [0] r=0 lpr=30 pi=[18,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:16 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 31 pg[4.14( empty local-lis/les=30/31 n=0 ec=30/18 lis/c=18/18 les/c/f=19/19/0 sis=30) [0] r=0 lpr=30 pi=[18,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:16 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 31 pg[4.15( empty local-lis/les=30/31 n=0 ec=30/18 lis/c=18/18 les/c/f=19/19/0 sis=30) [0] r=0 lpr=30 pi=[18,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:16 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 31 pg[4.8( empty local-lis/les=30/31 n=0 ec=30/18 lis/c=18/18 les/c/f=19/19/0 sis=30) [0] r=0 lpr=30 pi=[18,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:16 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 31 pg[4.17( empty local-lis/les=30/31 n=0 ec=30/18 lis/c=18/18 les/c/f=19/19/0 sis=30) [0] r=0 lpr=30 pi=[18,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:16 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 31 pg[4.9( empty local-lis/les=30/31 n=0 ec=30/18 lis/c=18/18 les/c/f=19/19/0 sis=30) [0] r=0 lpr=30 pi=[18,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:16 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 31 pg[4.b( empty local-lis/les=30/31 n=0 ec=30/18 lis/c=18/18 les/c/f=19/19/0 sis=30) [0] r=0 lpr=30 pi=[18,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:16 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 31 pg[4.0( empty local-lis/les=30/31 n=0 ec=18/18 lis/c=18/18 les/c/f=19/19/0 sis=30) [0] r=0 lpr=30 pi=[18,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:16 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 31 pg[4.a( empty local-lis/les=30/31 n=0 ec=30/18 lis/c=18/18 les/c/f=19/19/0 sis=30) [0] r=0 lpr=30 pi=[18,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:16 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 31 pg[4.16( empty local-lis/les=30/31 n=0 ec=30/18 lis/c=18/18 les/c/f=19/19/0 sis=30) [0] r=0 lpr=30 pi=[18,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:16 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 31 pg[4.d( empty local-lis/les=30/31 n=0 ec=30/18 lis/c=18/18 les/c/f=19/19/0 sis=30) [0] r=0 lpr=30 pi=[18,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:16 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 31 pg[4.7( empty local-lis/les=30/31 n=0 ec=30/18 lis/c=18/18 les/c/f=19/19/0 sis=30) [0] r=0 lpr=30 pi=[18,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:16 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 31 pg[4.c( empty local-lis/les=30/31 n=0 ec=30/18 lis/c=18/18 les/c/f=19/19/0 sis=30) [0] r=0 lpr=30 pi=[18,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:16 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 31 pg[4.2( empty local-lis/les=30/31 n=0 ec=30/18 lis/c=18/18 les/c/f=19/19/0 sis=30) [0] r=0 lpr=30 pi=[18,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:16 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 31 pg[4.6( empty local-lis/les=30/31 n=0 ec=30/18 lis/c=18/18 les/c/f=19/19/0 sis=30) [0] r=0 lpr=30 pi=[18,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:16 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 31 pg[4.5( empty local-lis/les=30/31 n=0 ec=30/18 lis/c=18/18 les/c/f=19/19/0 sis=30) [0] r=0 lpr=30 pi=[18,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:16 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 31 pg[4.f( empty local-lis/les=30/31 n=0 ec=30/18 lis/c=18/18 les/c/f=19/19/0 sis=30) [0] r=0 lpr=30 pi=[18,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:16 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 31 pg[4.1( empty local-lis/les=30/31 n=0 ec=30/18 lis/c=18/18 les/c/f=19/19/0 sis=30) [0] r=0 lpr=30 pi=[18,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:16 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 31 pg[4.1d( empty local-lis/les=30/31 n=0 ec=30/18 lis/c=18/18 les/c/f=19/19/0 sis=30) [0] r=0 lpr=30 pi=[18,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:16 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 31 pg[4.3( empty local-lis/les=30/31 n=0 ec=30/18 lis/c=18/18 les/c/f=19/19/0 sis=30) [0] r=0 lpr=30 pi=[18,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:16 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 31 pg[4.1c( empty local-lis/les=30/31 n=0 ec=30/18 lis/c=18/18 les/c/f=19/19/0 sis=30) [0] r=0 lpr=30 pi=[18,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:16 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 31 pg[4.1b( empty local-lis/les=30/31 n=0 ec=30/18 lis/c=18/18 les/c/f=19/19/0 sis=30) [0] r=0 lpr=30 pi=[18,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:16 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 31 pg[4.e( empty local-lis/les=30/31 n=0 ec=30/18 lis/c=18/18 les/c/f=19/19/0 sis=30) [0] r=0 lpr=30 pi=[18,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:16 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 31 pg[4.4( empty local-lis/les=30/31 n=0 ec=30/18 lis/c=18/18 les/c/f=19/19/0 sis=30) [0] r=0 lpr=30 pi=[18,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:16 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 31 pg[4.18( empty local-lis/les=30/31 n=0 ec=30/18 lis/c=18/18 les/c/f=19/19/0 sis=30) [0] r=0 lpr=30 pi=[18,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:16 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 31 pg[4.1a( empty local-lis/les=30/31 n=0 ec=30/18 lis/c=18/18 les/c/f=19/19/0 sis=30) [0] r=0 lpr=30 pi=[18,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:16 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 31 pg[4.19( empty local-lis/les=30/31 n=0 ec=30/18 lis/c=18/18 les/c/f=19/19/0 sis=30) [0] r=0 lpr=30 pi=[18,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:17 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 3.17 scrub starts
Dec  1 04:49:17 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 3.17 scrub ok
Dec  1 04:49:17 np0005540826 podman[79794]: 2025-12-01 09:49:17.926729637 +0000 UTC m=+0.072044736 container create 4eeb84913b850d0be5579f28cc633035a463fa1e974e8316e59d8c81213cf9e7 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=amazing_goodall, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid)
Dec  1 04:49:17 np0005540826 podman[79794]: 2025-12-01 09:49:17.875614252 +0000 UTC m=+0.020929311 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 04:49:18 np0005540826 systemd[1]: Started libpod-conmon-4eeb84913b850d0be5579f28cc633035a463fa1e974e8316e59d8c81213cf9e7.scope.
Dec  1 04:49:18 np0005540826 systemd[1]: Started libcrun container.
Dec  1 04:49:18 np0005540826 podman[79794]: 2025-12-01 09:49:18.125292726 +0000 UTC m=+0.270607805 container init 4eeb84913b850d0be5579f28cc633035a463fa1e974e8316e59d8c81213cf9e7 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=amazing_goodall, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, ceph=True, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid)
Dec  1 04:49:18 np0005540826 podman[79794]: 2025-12-01 09:49:18.133194498 +0000 UTC m=+0.278509557 container start 4eeb84913b850d0be5579f28cc633035a463fa1e974e8316e59d8c81213cf9e7 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=amazing_goodall, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, OSD_FLAVOR=default)
Dec  1 04:49:18 np0005540826 podman[79794]: 2025-12-01 09:49:18.13628954 +0000 UTC m=+0.281604609 container attach 4eeb84913b850d0be5579f28cc633035a463fa1e974e8316e59d8c81213cf9e7 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=amazing_goodall, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_REF=squid, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Dec  1 04:49:18 np0005540826 amazing_goodall[79810]: 167 167
Dec  1 04:49:18 np0005540826 systemd[1]: libpod-4eeb84913b850d0be5579f28cc633035a463fa1e974e8316e59d8c81213cf9e7.scope: Deactivated successfully.
Dec  1 04:49:18 np0005540826 podman[79794]: 2025-12-01 09:49:18.139924968 +0000 UTC m=+0.285240037 container died 4eeb84913b850d0be5579f28cc633035a463fa1e974e8316e59d8c81213cf9e7 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=amazing_goodall, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:49:18 np0005540826 systemd[1]: var-lib-containers-storage-overlay-a27f71b894a6eef1ca7a4f95dd8f62ee55cc5544ccf00928ed8132ac01bb9027-merged.mount: Deactivated successfully.
Dec  1 04:49:18 np0005540826 podman[79794]: 2025-12-01 09:49:18.180279418 +0000 UTC m=+0.325594517 container remove 4eeb84913b850d0be5579f28cc633035a463fa1e974e8316e59d8c81213cf9e7 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=amazing_goodall, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True)
Dec  1 04:49:18 np0005540826 systemd[1]: libpod-conmon-4eeb84913b850d0be5579f28cc633035a463fa1e974e8316e59d8c81213cf9e7.scope: Deactivated successfully.
Dec  1 04:49:18 np0005540826 podman[79827]: 2025-12-01 09:49:18.255725588 +0000 UTC m=+0.049386903 container create 01691aeab7de733815f3bb9a1b89257756a253ae997cf998b080c4aba14812f5 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=jovial_wu, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec  1 04:49:18 np0005540826 systemd[1]: Started libpod-conmon-01691aeab7de733815f3bb9a1b89257756a253ae997cf998b080c4aba14812f5.scope.
Dec  1 04:49:18 np0005540826 systemd[1]: Started libcrun container.
Dec  1 04:49:18 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9eba458d261f3aee54c1899e59bbe82b2ee91dab23bba54b2ffa12cb26b76cdc/merged/tmp/config supports timestamps until 2038 (0x7fffffff)
Dec  1 04:49:18 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9eba458d261f3aee54c1899e59bbe82b2ee91dab23bba54b2ffa12cb26b76cdc/merged/tmp/keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 04:49:18 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9eba458d261f3aee54c1899e59bbe82b2ee91dab23bba54b2ffa12cb26b76cdc/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:49:18 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9eba458d261f3aee54c1899e59bbe82b2ee91dab23bba54b2ffa12cb26b76cdc/merged/var/lib/ceph/mon/ceph-compute-1 supports timestamps until 2038 (0x7fffffff)
Dec  1 04:49:18 np0005540826 podman[79827]: 2025-12-01 09:49:18.234954952 +0000 UTC m=+0.028616267 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 04:49:18 np0005540826 podman[79827]: 2025-12-01 09:49:18.343961121 +0000 UTC m=+0.137622456 container init 01691aeab7de733815f3bb9a1b89257756a253ae997cf998b080c4aba14812f5 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=jovial_wu, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:49:18 np0005540826 podman[79827]: 2025-12-01 09:49:18.355379907 +0000 UTC m=+0.149041242 container start 01691aeab7de733815f3bb9a1b89257756a253ae997cf998b080c4aba14812f5 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=jovial_wu, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.build-date=20250325)
Dec  1 04:49:18 np0005540826 podman[79827]: 2025-12-01 09:49:18.359361743 +0000 UTC m=+0.153023058 container attach 01691aeab7de733815f3bb9a1b89257756a253ae997cf998b080c4aba14812f5 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=jovial_wu, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, CEPH_REF=squid, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2)
Dec  1 04:49:18 np0005540826 systemd[1]: libpod-01691aeab7de733815f3bb9a1b89257756a253ae997cf998b080c4aba14812f5.scope: Deactivated successfully.
Dec  1 04:49:18 np0005540826 podman[79869]: 2025-12-01 09:49:18.47950259 +0000 UTC m=+0.023625943 container died 01691aeab7de733815f3bb9a1b89257756a253ae997cf998b080c4aba14812f5 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=jovial_wu, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec  1 04:49:18 np0005540826 systemd[1]: var-lib-containers-storage-overlay-9eba458d261f3aee54c1899e59bbe82b2ee91dab23bba54b2ffa12cb26b76cdc-merged.mount: Deactivated successfully.
Dec  1 04:49:18 np0005540826 podman[79869]: 2025-12-01 09:49:18.508435575 +0000 UTC m=+0.052558908 container remove 01691aeab7de733815f3bb9a1b89257756a253ae997cf998b080c4aba14812f5 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=jovial_wu, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:49:18 np0005540826 systemd[1]: libpod-conmon-01691aeab7de733815f3bb9a1b89257756a253ae997cf998b080c4aba14812f5.scope: Deactivated successfully.
Dec  1 04:49:18 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 3.14 scrub starts
Dec  1 04:49:18 np0005540826 systemd[1]: Reloading.
Dec  1 04:49:18 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 3.14 scrub ok
Dec  1 04:49:18 np0005540826 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:49:18 np0005540826 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:49:18 np0005540826 systemd[1]: Reloading.
Dec  1 04:49:18 np0005540826 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:49:18 np0005540826 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:49:19 np0005540826 systemd[1]: Starting Ceph mon.compute-1 for 365f19c2-81e5-5edd-b6b4-280555214d3a...
Dec  1 04:49:19 np0005540826 podman[80007]: 2025-12-01 09:49:19.362672248 +0000 UTC m=+0.049699632 container create 7505fa15a86ea156c46055cff557584ec61f6df8f2cd50afcbe9ed04fcde498c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mon-compute-1, CEPH_REF=squid, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:49:19 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/66d30437afb669fd2fd1380913e3e2608cc76e6364c33a394beea1ea9a75a9c4/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:49:19 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/66d30437afb669fd2fd1380913e3e2608cc76e6364c33a394beea1ea9a75a9c4/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:49:19 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/66d30437afb669fd2fd1380913e3e2608cc76e6364c33a394beea1ea9a75a9c4/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:49:19 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/66d30437afb669fd2fd1380913e3e2608cc76e6364c33a394beea1ea9a75a9c4/merged/var/lib/ceph/mon/ceph-compute-1 supports timestamps until 2038 (0x7fffffff)
Dec  1 04:49:19 np0005540826 podman[80007]: 2025-12-01 09:49:19.342021195 +0000 UTC m=+0.029048639 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 04:49:19 np0005540826 podman[80007]: 2025-12-01 09:49:19.44155471 +0000 UTC m=+0.128582124 container init 7505fa15a86ea156c46055cff557584ec61f6df8f2cd50afcbe9ed04fcde498c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mon-compute-1, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid)
Dec  1 04:49:19 np0005540826 podman[80007]: 2025-12-01 09:49:19.453024417 +0000 UTC m=+0.140051811 container start 7505fa15a86ea156c46055cff557584ec61f6df8f2cd50afcbe9ed04fcde498c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mon-compute-1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0)
Dec  1 04:49:19 np0005540826 bash[80007]: 7505fa15a86ea156c46055cff557584ec61f6df8f2cd50afcbe9ed04fcde498c
Dec  1 04:49:19 np0005540826 systemd[1]: Started Ceph mon.compute-1 for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: set uid:gid to 167:167 (ceph:ceph)
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mon, pid 2
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: pidfile_write: ignore empty --pid-file
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: load: jerasure load: lrc 
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb: RocksDB version: 7.9.2
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb: Git sha 0
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb: Compile date 2025-07-17 03:12:14
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb: DB SUMMARY
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb: DB Session ID:  KSHNHU57VR1LHZ6EZUM4
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb: CURRENT file:  CURRENT
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb: IDENTITY file:  IDENTITY
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb: MANIFEST file:  MANIFEST-000005 size: 59 Bytes
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb: SST files in /var/lib/ceph/mon/ceph-compute-1/store.db dir, Total Num: 0, files: 
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-compute-1/store.db: 000004.log size: 511 ; 
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:                         Options.error_if_exists: 0
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:                       Options.create_if_missing: 0
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:                         Options.paranoid_checks: 1
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:             Options.flush_verify_memtable_count: 1
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:                                     Options.env: 0x55d42f1f3c20
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:                                      Options.fs: PosixFileSystem
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:                                Options.info_log: 0x55d4317b4e40
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:                Options.max_file_opening_threads: 16
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:                              Options.statistics: (nil)
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:                               Options.use_fsync: 0
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:                       Options.max_log_file_size: 0
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:                   Options.log_file_time_to_roll: 0
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:                       Options.keep_log_file_num: 1000
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:                    Options.recycle_log_file_num: 0
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:                         Options.allow_fallocate: 1
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:                        Options.allow_mmap_reads: 0
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:                       Options.allow_mmap_writes: 0
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:                        Options.use_direct_reads: 0
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:          Options.create_missing_column_families: 0
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:                              Options.db_log_dir: 
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:                                 Options.wal_dir: 
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:                Options.table_cache_numshardbits: 6
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:                         Options.WAL_ttl_seconds: 0
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:                       Options.WAL_size_limit_MB: 0
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:             Options.manifest_preallocation_size: 4194304
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:                     Options.is_fd_close_on_exec: 1
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:                   Options.advise_random_on_open: 1
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:                    Options.db_write_buffer_size: 0
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:                    Options.write_buffer_manager: 0x55d4317b9900
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:         Options.access_hint_on_compaction_start: 1
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:                      Options.use_adaptive_mutex: 0
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:                            Options.rate_limiter: (nil)
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:                       Options.wal_recovery_mode: 2
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:                  Options.enable_thread_tracking: 0
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:                  Options.enable_pipelined_write: 0
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:                  Options.unordered_write: 0
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:             Options.write_thread_max_yield_usec: 100
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:                               Options.row_cache: None
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:                              Options.wal_filter: None
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:             Options.avoid_flush_during_recovery: 0
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:             Options.allow_ingest_behind: 0
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:             Options.two_write_queues: 0
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:             Options.manual_wal_flush: 0
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:             Options.wal_compression: 0
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:             Options.atomic_flush: 0
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:                 Options.persist_stats_to_disk: 0
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:                 Options.write_dbid_to_manifest: 0
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:                 Options.log_readahead_size: 0
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:                 Options.best_efforts_recovery: 0
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:             Options.allow_data_in_errors: 0
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:             Options.db_host_id: __hostname__
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:             Options.enforce_single_del_contracts: true
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:             Options.max_background_jobs: 2
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:             Options.max_background_compactions: -1
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:             Options.max_subcompactions: 1
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:           Options.writable_file_max_buffer_size: 1048576
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:             Options.delayed_write_rate : 16777216
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:             Options.max_total_wal_size: 0
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:                   Options.stats_dump_period_sec: 600
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:                 Options.stats_persist_period_sec: 600
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:                          Options.max_open_files: -1
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:                          Options.bytes_per_sync: 0
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:                      Options.wal_bytes_per_sync: 0
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:                   Options.strict_bytes_per_sync: 0
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:       Options.compaction_readahead_size: 0
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:                  Options.max_background_flushes: -1
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb: Compression algorithms supported:
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb: #011kZSTD supported: 0
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb: #011kXpressCompression supported: 0
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb: #011kBZip2Compression supported: 0
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb: #011kZSTDNotFinalCompression supported: 0
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb: #011kLZ4Compression supported: 1
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb: #011kZlibCompression supported: 1
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb: #011kLZ4HCCompression supported: 1
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb: #011kSnappyCompression supported: 1
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb: Fast CRC32 supported: Supported on x86
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb: DMutex implementation: pthread_mutex_t
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-compute-1/store.db/MANIFEST-000005
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:           Options.merge_operator: 
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:        Options.compaction_filter: None
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d4317b4700)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55d4317d9350#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 536870912#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:        Options.write_buffer_size: 33554432
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:  Options.max_write_buffer_number: 2
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:          Options.compression: NoCompression
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:             Options.num_levels: 7
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:        Options.min_write_buffer_number_to_merge: 1
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:      Options.level0_file_num_compaction_trigger: 4
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:                Options.max_bytes_for_level_base: 268435456
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:          Options.max_bytes_for_level_multiplier: 10.000000
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-compute-1/store.db/MANIFEST-000005 succeeded,manifest_file_number is 5, next_file_number is 7, last_sequence is 0, log_number is 0,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 0
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 0
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: d6e1f97e-eb58-41c1-b758-cb672eabd75e
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582559502619, "job": 1, "event": "recovery_started", "wal_files": [4]}
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #4 mode 2
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582559504391, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 8, "file_size": 1648, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 1, "largest_seqno": 5, "table_properties": {"data_size": 523, "index_size": 31, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 115, "raw_average_key_size": 23, "raw_value_size": 401, "raw_average_value_size": 80, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582559, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d6e1f97e-eb58-41c1-b758-cb672eabd75e", "db_session_id": "KSHNHU57VR1LHZ6EZUM4", "orig_file_number": 8, "seqno_to_time_mapping": "N/A"}}
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582559504527, "job": 1, "event": "recovery_finished"}
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb: [db/version_set.cc:5047] Creating manifest 10
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000004.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x55d4317dae00
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb: DB pointer 0x55d4318e4000
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: mon.compute-1 does not exist in monmap, will attempt to join an existing cluster
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 0.0 total, 0.0 interval#012Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s#012Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      1/0    1.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0#012 Sum      1/0    1.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.0 total, 0.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.13 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.13 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55d4317d9350#2 capacity: 512.00 MB usage: 0.22 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 5.2e-05 secs_since: 0#012Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.11 KB,2.08616e-05%) Misc(2,0.64 KB,0.00012219%)#012#012** File Read Latency Histogram By Level [default] **
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: using public_addr v2:192.168.122.101:0/0 -> [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0]
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: starting mon.compute-1 rank -1 at public addrs [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] at bind addrs [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] mon_data /var/lib/ceph/mon/ceph-compute-1 fsid 365f19c2-81e5-5edd-b6b4-280555214d3a
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: mon.compute-1@-1(???) e0 preinit fsid 365f19c2-81e5-5edd-b6b4-280555214d3a
Dec  1 04:49:19 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 3.16 scrub starts
Dec  1 04:49:19 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 3.16 scrub ok
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: mon.compute-1@-1(synchronizing).mds e1 new map
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: mon.compute-1@-1(synchronizing).mds e1 print_map#012e1#012btime 2025-12-01T09:46:50:475394+0000#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012legacy client fscid: -1#012 #012No filesystems configured
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: mon.compute-1@-1(synchronizing).osd e0 _set_cache_ratios kv ratio 0.25 inc ratio 0.375 full ratio 0.375
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: mon.compute-1@-1(synchronizing).osd e0 register_cache_with_pcm pcm target: 2147483648 pcm max: 1020054732 pcm min: 134217728 inc_osd_cache size: 1
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: mon.compute-1@-1(synchronizing).osd e1 e1: 0 total, 0 up, 0 in
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: mon.compute-1@-1(synchronizing).osd e2 e2: 0 total, 0 up, 0 in
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: mon.compute-1@-1(synchronizing).osd e3 e3: 0 total, 0 up, 0 in
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: mon.compute-1@-1(synchronizing).osd e4 e4: 1 total, 0 up, 1 in
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: mon.compute-1@-1(synchronizing).osd e5 e5: 2 total, 0 up, 2 in
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: mon.compute-1@-1(synchronizing).osd e6 e6: 2 total, 0 up, 2 in
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: mon.compute-1@-1(synchronizing).osd e7 e7: 2 total, 0 up, 2 in
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: mon.compute-1@-1(synchronizing).osd e8 e8: 2 total, 0 up, 2 in
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: mon.compute-1@-1(synchronizing).osd e9 e9: 2 total, 2 up, 2 in
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: mon.compute-1@-1(synchronizing).osd e10 e10: 2 total, 2 up, 2 in
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: mon.compute-1@-1(synchronizing).osd e11 e11: 2 total, 2 up, 2 in
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: mon.compute-1@-1(synchronizing).osd e12 e12: 2 total, 2 up, 2 in
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: mon.compute-1@-1(synchronizing).osd e13 e13: 2 total, 2 up, 2 in
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: mon.compute-1@-1(synchronizing).osd e14 e14: 2 total, 2 up, 2 in
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: mon.compute-1@-1(synchronizing).osd e15 e15: 2 total, 2 up, 2 in
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: mon.compute-1@-1(synchronizing).osd e16 e16: 2 total, 2 up, 2 in
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: mon.compute-1@-1(synchronizing).osd e17 e17: 2 total, 2 up, 2 in
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: mon.compute-1@-1(synchronizing).osd e18 e18: 2 total, 2 up, 2 in
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: mon.compute-1@-1(synchronizing).osd e19 e19: 2 total, 2 up, 2 in
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: mon.compute-1@-1(synchronizing).osd e20 e20: 2 total, 2 up, 2 in
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: mon.compute-1@-1(synchronizing).osd e21 e21: 2 total, 2 up, 2 in
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: mon.compute-1@-1(synchronizing).osd e22 e22: 2 total, 2 up, 2 in
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: mon.compute-1@-1(synchronizing).osd e23 e23: 2 total, 2 up, 2 in
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: mon.compute-1@-1(synchronizing).osd e24 e24: 2 total, 2 up, 2 in
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: mon.compute-1@-1(synchronizing).osd e25 e25: 2 total, 2 up, 2 in
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: mon.compute-1@-1(synchronizing).osd e26 e26: 2 total, 2 up, 2 in
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: mon.compute-1@-1(synchronizing).osd e27 e27: 2 total, 2 up, 2 in
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: mon.compute-1@-1(synchronizing).osd e28 e28: 2 total, 2 up, 2 in
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: mon.compute-1@-1(synchronizing).osd e29 e29: 2 total, 2 up, 2 in
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: mon.compute-1@-1(synchronizing).osd e30 e30: 2 total, 2 up, 2 in
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: mon.compute-1@-1(synchronizing).osd e31 e31: 2 total, 2 up, 2 in
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: mon.compute-1@-1(synchronizing).osd e31 crush map has features 3314933000852226048, adjusting msgr requires
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: mon.compute-1@-1(synchronizing).osd e31 crush map has features 288514051259236352, adjusting msgr requires
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: mon.compute-1@-1(synchronizing).osd e31 crush map has features 288514051259236352, adjusting msgr requires
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: mon.compute-1@-1(synchronizing).osd e31 crush map has features 288514051259236352, adjusting msgr requires
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: Updating compute-2:/etc/ceph/ceph.conf
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num", "val": "32"}]': finished
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num_actual", "val": "32"}]': finished
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: from='client.? 192.168.122.100:0/3828223939' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "images", "app": "rbd"}]': finished
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num", "val": "32"}]: dispatch
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: Updating compute-2:/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.conf
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: Updating compute-2:/etc/ceph/ceph.client.admin.keyring
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: Updating compute-2:/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.client.admin.keyring
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num", "val": "32"}]': finished
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num_actual", "val": "32"}]': finished
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pg_num", "val": "32"}]: dispatch
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' 
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' 
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' 
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: Deploying daemon mon.compute-2 on compute-2
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: from='client.? 192.168.122.100:0/3663653222' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.meta", "app": "cephfs"}]: dispatch
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pg_num", "val": "32"}]': finished
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num_actual", "val": "32"}]': finished
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: from='client.? 192.168.122.100:0/3663653222' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.meta", "app": "cephfs"}]': finished
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num", "val": "32"}]: dispatch
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: Health check cleared: CEPHADM_APPLY_SPEC_FAIL (was: Failed to apply 2 service(s): mon,mgr)
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: from='client.? 192.168.122.100:0/762968888' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.data", "app": "cephfs"}]: dispatch
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num", "val": "32"}]': finished
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: from='client.? 192.168.122.100:0/762968888' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.data", "app": "cephfs"}]': finished
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num", "val": "32"}]: dispatch
Dec  1 04:49:19 np0005540826 ceph-mon[80026]: mon.compute-1@-1(synchronizing).paxosservice(auth 1..7) refresh upgraded, format 0 -> 3
Dec  1 04:49:20 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 3.13 scrub starts
Dec  1 04:49:20 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 3.13 scrub ok
Dec  1 04:49:21 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 3.11 deep-scrub starts
Dec  1 04:49:21 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 3.11 deep-scrub ok
Dec  1 04:49:22 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 3.c scrub starts
Dec  1 04:49:22 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 3.c scrub ok
Dec  1 04:49:22 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 32 pg[2.19( empty local-lis/les=0/0 n=0 ec=28/14 lis/c=28/28 les/c/f=30/30/0 sis=32) [0] r=0 lpr=32 pi=[28,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:22 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 32 pg[2.15( empty local-lis/les=0/0 n=0 ec=28/14 lis/c=28/28 les/c/f=30/30/0 sis=32) [0] r=0 lpr=32 pi=[28,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:22 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 32 pg[2.13( empty local-lis/les=0/0 n=0 ec=28/14 lis/c=28/28 les/c/f=30/30/0 sis=32) [0] r=0 lpr=32 pi=[28,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:22 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 32 pg[2.10( empty local-lis/les=0/0 n=0 ec=28/14 lis/c=28/28 les/c/f=30/30/0 sis=32) [0] r=0 lpr=32 pi=[28,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:22 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 32 pg[2.e( empty local-lis/les=0/0 n=0 ec=28/14 lis/c=28/28 les/c/f=30/30/0 sis=32) [0] r=0 lpr=32 pi=[28,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:22 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 32 pg[2.c( empty local-lis/les=0/0 n=0 ec=28/14 lis/c=28/28 les/c/f=30/30/0 sis=32) [0] r=0 lpr=32 pi=[28,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:22 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 32 pg[2.d( empty local-lis/les=0/0 n=0 ec=28/14 lis/c=28/28 les/c/f=30/30/0 sis=32) [0] r=0 lpr=32 pi=[28,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:22 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 32 pg[2.a( empty local-lis/les=0/0 n=0 ec=28/14 lis/c=28/28 les/c/f=30/30/0 sis=32) [0] r=0 lpr=32 pi=[28,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:22 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 32 pg[2.1( empty local-lis/les=0/0 n=0 ec=28/14 lis/c=28/28 les/c/f=30/30/0 sis=32) [0] r=0 lpr=32 pi=[28,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:22 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 32 pg[2.6( empty local-lis/les=0/0 n=0 ec=28/14 lis/c=28/28 les/c/f=30/30/0 sis=32) [0] r=0 lpr=32 pi=[28,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:22 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 32 pg[2.4( empty local-lis/les=0/0 n=0 ec=28/14 lis/c=28/28 les/c/f=30/30/0 sis=32) [0] r=0 lpr=32 pi=[28,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:22 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 32 pg[2.9( empty local-lis/les=0/0 n=0 ec=28/14 lis/c=28/28 les/c/f=30/30/0 sis=32) [0] r=0 lpr=32 pi=[28,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:22 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 32 pg[2.1b( empty local-lis/les=0/0 n=0 ec=28/14 lis/c=28/28 les/c/f=30/30/0 sis=32) [0] r=0 lpr=32 pi=[28,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:22 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 32 pg[2.1f( empty local-lis/les=0/0 n=0 ec=28/14 lis/c=28/28 les/c/f=30/30/0 sis=32) [0] r=0 lpr=32 pi=[28,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:22 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 32 pg[2.1e( empty local-lis/les=0/0 n=0 ec=28/14 lis/c=28/28 les/c/f=30/30/0 sis=32) [0] r=0 lpr=32 pi=[28,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:22 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 32 pg[4.18( empty local-lis/les=30/31 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=32 pruub=9.648466110s) [1] r=-1 lpr=32 pi=[30,32)/1 crt=0'0 mlcod 0'0 active pruub 65.956504822s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:49:22 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 32 pg[4.18( empty local-lis/les=30/31 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=32 pruub=9.648443222s) [1] r=-1 lpr=32 pi=[30,32)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 65.956504822s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:49:22 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 32 pg[3.1d( empty local-lis/les=29/30 n=0 ec=29/16 lis/c=29/29 les/c/f=30/30/0 sis=32 pruub=8.641237259s) [1] r=-1 lpr=32 pi=[29,32)/1 crt=0'0 mlcod 0'0 active pruub 64.949462891s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:49:22 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 32 pg[3.1d( empty local-lis/les=29/30 n=0 ec=29/16 lis/c=29/29 les/c/f=30/30/0 sis=32 pruub=8.641219139s) [1] r=-1 lpr=32 pi=[29,32)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 64.949462891s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:49:22 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 32 pg[4.1a( empty local-lis/les=30/31 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=32 pruub=9.648180962s) [1] r=-1 lpr=32 pi=[30,32)/1 crt=0'0 mlcod 0'0 active pruub 65.956527710s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:49:22 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 32 pg[4.1a( empty local-lis/les=30/31 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=32 pruub=9.648171425s) [1] r=-1 lpr=32 pi=[30,32)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 65.956527710s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:49:22 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 32 pg[3.1c( empty local-lis/les=29/30 n=0 ec=29/16 lis/c=29/29 les/c/f=30/30/0 sis=32 pruub=8.641048431s) [1] r=-1 lpr=32 pi=[29,32)/1 crt=0'0 mlcod 0'0 active pruub 64.949462891s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:49:22 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 32 pg[3.1c( empty local-lis/les=29/30 n=0 ec=29/16 lis/c=29/29 les/c/f=30/30/0 sis=32 pruub=8.641037941s) [1] r=-1 lpr=32 pi=[29,32)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 64.949462891s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:49:22 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 32 pg[4.1b( empty local-lis/les=30/31 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=32 pruub=9.647929192s) [1] r=-1 lpr=32 pi=[30,32)/1 crt=0'0 mlcod 0'0 active pruub 65.956459045s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:49:22 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 32 pg[4.1b( empty local-lis/les=30/31 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=32 pruub=9.647916794s) [1] r=-1 lpr=32 pi=[30,32)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 65.956459045s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:49:22 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 32 pg[3.1a( empty local-lis/les=29/30 n=0 ec=29/16 lis/c=29/29 les/c/f=30/30/0 sis=32 pruub=8.640820503s) [1] r=-1 lpr=32 pi=[29,32)/1 crt=0'0 mlcod 0'0 active pruub 64.949447632s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:49:22 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 32 pg[3.1a( empty local-lis/les=29/30 n=0 ec=29/16 lis/c=29/29 les/c/f=30/30/0 sis=32 pruub=8.640810966s) [1] r=-1 lpr=32 pi=[29,32)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 64.949447632s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:49:22 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 32 pg[4.e( empty local-lis/les=30/31 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=32 pruub=9.647756577s) [1] r=-1 lpr=32 pi=[30,32)/1 crt=0'0 mlcod 0'0 active pruub 65.956474304s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:49:22 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 32 pg[4.e( empty local-lis/les=30/31 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=32 pruub=9.647747040s) [1] r=-1 lpr=32 pi=[30,32)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 65.956474304s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:49:22 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 32 pg[3.9( empty local-lis/les=29/30 n=0 ec=29/16 lis/c=29/29 les/c/f=30/30/0 sis=32 pruub=8.640616417s) [1] r=-1 lpr=32 pi=[29,32)/1 crt=0'0 mlcod 0'0 active pruub 64.949409485s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:49:22 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 32 pg[3.9( empty local-lis/les=29/30 n=0 ec=29/16 lis/c=29/29 les/c/f=30/30/0 sis=32 pruub=8.640607834s) [1] r=-1 lpr=32 pi=[29,32)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 64.949409485s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:49:22 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 32 pg[3.3( empty local-lis/les=29/30 n=0 ec=29/16 lis/c=29/29 les/c/f=30/30/0 sis=32 pruub=8.640268326s) [1] r=-1 lpr=32 pi=[29,32)/1 crt=0'0 mlcod 0'0 active pruub 64.949211121s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:49:22 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 32 pg[3.3( empty local-lis/les=29/30 n=0 ec=29/16 lis/c=29/29 les/c/f=30/30/0 sis=32 pruub=8.640258789s) [1] r=-1 lpr=32 pi=[29,32)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 64.949211121s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:49:22 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 32 pg[4.5( empty local-lis/les=30/31 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=32 pruub=9.647382736s) [1] r=-1 lpr=32 pi=[30,32)/1 crt=0'0 mlcod 0'0 active pruub 65.956398010s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:49:22 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 32 pg[4.5( empty local-lis/les=30/31 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=32 pruub=9.647374153s) [1] r=-1 lpr=32 pi=[30,32)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 65.956398010s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:49:22 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 32 pg[3.5( empty local-lis/les=29/30 n=0 ec=29/16 lis/c=29/29 les/c/f=30/30/0 sis=32 pruub=8.639948845s) [1] r=-1 lpr=32 pi=[29,32)/1 crt=0'0 mlcod 0'0 active pruub 64.949073792s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:49:22 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 32 pg[3.5( empty local-lis/les=29/30 n=0 ec=29/16 lis/c=29/29 les/c/f=30/30/0 sis=32 pruub=8.639939308s) [1] r=-1 lpr=32 pi=[29,32)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 64.949073792s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:49:22 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 32 pg[6.0( empty local-lis/les=21/22 n=0 ec=21/21 lis/c=21/21 les/c/f=22/22/0 sis=32 pruub=14.373450279s) [0] r=0 lpr=32 pi=[21,32)/1 crt=0'0 mlcod 0'0 active pruub 70.682662964s@ mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [0], acting_primary 0 -> 0, up_primary 0 -> 0, role 0 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:49:22 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 32 pg[6.0( empty local-lis/les=21/22 n=0 ec=21/21 lis/c=21/21 les/c/f=22/22/0 sis=32 pruub=14.373450279s) [0] r=0 lpr=32 pi=[21,32)/1 crt=0'0 mlcod 0'0 unknown pruub 70.682662964s@ mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:22 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 32 pg[4.1( empty local-lis/les=30/31 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=32 pruub=9.645117760s) [1] r=-1 lpr=32 pi=[30,32)/1 crt=0'0 mlcod 0'0 active pruub 65.956428528s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:49:22 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 32 pg[4.1( empty local-lis/les=30/31 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=32 pruub=9.645098686s) [1] r=-1 lpr=32 pi=[30,32)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 65.956428528s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:49:22 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 32 pg[5.0( empty local-lis/les=19/20 n=0 ec=19/19 lis/c=19/19 les/c/f=20/20/0 sis=32 pruub=11.999377251s) [0] r=0 lpr=32 pi=[19,32)/1 crt=0'0 mlcod 0'0 active pruub 68.310798645s@ mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [0], acting_primary 0 -> 0, up_primary 0 -> 0, role 0 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:49:22 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 32 pg[4.d( empty local-lis/les=30/31 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=32 pruub=9.644155502s) [1] r=-1 lpr=32 pi=[30,32)/1 crt=0'0 mlcod 0'0 active pruub 65.956275940s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:49:22 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 32 pg[4.d( empty local-lis/les=30/31 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=32 pruub=9.644074440s) [1] r=-1 lpr=32 pi=[30,32)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 65.956275940s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:49:22 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 32 pg[3.a( empty local-lis/les=29/30 n=0 ec=29/16 lis/c=29/29 les/c/f=30/30/0 sis=32 pruub=8.636708260s) [1] r=-1 lpr=32 pi=[29,32)/1 crt=0'0 mlcod 0'0 active pruub 64.948997498s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:49:22 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 32 pg[4.c( empty local-lis/les=30/31 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=32 pruub=9.644007683s) [1] r=-1 lpr=32 pi=[30,32)/1 crt=0'0 mlcod 0'0 active pruub 65.956306458s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:49:22 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 32 pg[4.c( empty local-lis/les=30/31 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=32 pruub=9.643986702s) [1] r=-1 lpr=32 pi=[30,32)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 65.956306458s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:49:22 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 32 pg[3.a( empty local-lis/les=29/30 n=0 ec=29/16 lis/c=29/29 les/c/f=30/30/0 sis=32 pruub=8.636678696s) [1] r=-1 lpr=32 pi=[29,32)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 64.948997498s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:49:22 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 32 pg[3.c( empty local-lis/les=29/30 n=0 ec=29/16 lis/c=29/29 les/c/f=30/30/0 sis=32 pruub=8.636301041s) [1] r=-1 lpr=32 pi=[29,32)/1 crt=0'0 mlcod 0'0 active pruub 64.948837280s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:49:22 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 32 pg[4.a( empty local-lis/les=30/31 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=32 pruub=9.643692970s) [1] r=-1 lpr=32 pi=[30,32)/1 crt=0'0 mlcod 0'0 active pruub 65.956253052s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:49:22 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 32 pg[3.c( empty local-lis/les=29/30 n=0 ec=29/16 lis/c=29/29 les/c/f=30/30/0 sis=32 pruub=8.636275291s) [1] r=-1 lpr=32 pi=[29,32)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 64.948837280s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:49:22 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 32 pg[4.a( empty local-lis/les=30/31 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=32 pruub=9.643675804s) [1] r=-1 lpr=32 pi=[30,32)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 65.956253052s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:49:22 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 32 pg[3.d( empty local-lis/les=29/30 n=0 ec=29/16 lis/c=29/29 les/c/f=30/30/0 sis=32 pruub=8.636133194s) [1] r=-1 lpr=32 pi=[29,32)/1 crt=0'0 mlcod 0'0 active pruub 64.948829651s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:49:22 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 32 pg[3.d( empty local-lis/les=29/30 n=0 ec=29/16 lis/c=29/29 les/c/f=30/30/0 sis=32 pruub=8.636118889s) [1] r=-1 lpr=32 pi=[29,32)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 64.948829651s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:49:22 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 32 pg[5.0( empty local-lis/les=19/20 n=0 ec=19/19 lis/c=19/19 les/c/f=20/20/0 sis=32 pruub=11.999377251s) [0] r=0 lpr=32 pi=[19,32)/1 crt=0'0 mlcod 0'0 unknown pruub 68.310798645s@ mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:22 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 32 pg[4.9( empty local-lis/les=30/31 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=32 pruub=9.643444061s) [1] r=-1 lpr=32 pi=[30,32)/1 crt=0'0 mlcod 0'0 active pruub 65.956207275s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:49:22 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 32 pg[4.9( empty local-lis/les=30/31 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=32 pruub=9.643422127s) [1] r=-1 lpr=32 pi=[30,32)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 65.956207275s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:49:22 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 32 pg[3.e( empty local-lis/les=29/30 n=0 ec=29/16 lis/c=29/29 les/c/f=30/30/0 sis=32 pruub=8.636075974s) [1] r=-1 lpr=32 pi=[29,32)/1 crt=0'0 mlcod 0'0 active pruub 64.948905945s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:49:22 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 32 pg[3.e( empty local-lis/les=29/30 n=0 ec=29/16 lis/c=29/29 les/c/f=30/30/0 sis=32 pruub=8.636057854s) [1] r=-1 lpr=32 pi=[29,32)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 64.948905945s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:49:22 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 32 pg[3.f( empty local-lis/les=29/30 n=0 ec=29/16 lis/c=29/29 les/c/f=30/30/0 sis=32 pruub=8.635717392s) [1] r=-1 lpr=32 pi=[29,32)/1 crt=0'0 mlcod 0'0 active pruub 64.948799133s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:49:22 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 32 pg[4.8( empty local-lis/les=30/31 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=32 pruub=9.643071175s) [1] r=-1 lpr=32 pi=[30,32)/1 crt=0'0 mlcod 0'0 active pruub 65.956176758s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:49:22 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 32 pg[3.f( empty local-lis/les=29/30 n=0 ec=29/16 lis/c=29/29 les/c/f=30/30/0 sis=32 pruub=8.635694504s) [1] r=-1 lpr=32 pi=[29,32)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 64.948799133s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:49:22 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 32 pg[4.8( empty local-lis/les=30/31 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=32 pruub=9.643044472s) [1] r=-1 lpr=32 pi=[30,32)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 65.956176758s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:49:22 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 32 pg[3.10( empty local-lis/les=29/30 n=0 ec=29/16 lis/c=29/29 les/c/f=30/30/0 sis=32 pruub=8.635479927s) [1] r=-1 lpr=32 pi=[29,32)/1 crt=0'0 mlcod 0'0 active pruub 64.948806763s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:49:22 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 32 pg[3.10( empty local-lis/les=29/30 n=0 ec=29/16 lis/c=29/29 les/c/f=30/30/0 sis=32 pruub=8.635458946s) [1] r=-1 lpr=32 pi=[29,32)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 64.948806763s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:49:22 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 32 pg[4.15( empty local-lis/les=30/31 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=32 pruub=9.642662048s) [1] r=-1 lpr=32 pi=[30,32)/1 crt=0'0 mlcod 0'0 active pruub 65.956176758s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:49:22 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 32 pg[4.15( empty local-lis/les=30/31 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=32 pruub=9.642645836s) [1] r=-1 lpr=32 pi=[30,32)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 65.956176758s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:49:22 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 32 pg[3.11( empty local-lis/les=29/30 n=0 ec=29/16 lis/c=29/29 les/c/f=30/30/0 sis=32 pruub=8.635169029s) [1] r=-1 lpr=32 pi=[29,32)/1 crt=0'0 mlcod 0'0 active pruub 64.948776245s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:49:22 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 32 pg[4.13( empty local-lis/les=30/31 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=32 pruub=9.642497063s) [1] r=-1 lpr=32 pi=[30,32)/1 crt=0'0 mlcod 0'0 active pruub 65.956146240s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:49:22 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 32 pg[3.11( empty local-lis/les=29/30 n=0 ec=29/16 lis/c=29/29 les/c/f=30/30/0 sis=32 pruub=8.635141373s) [1] r=-1 lpr=32 pi=[29,32)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 64.948776245s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:49:22 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 32 pg[4.13( empty local-lis/les=30/31 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=32 pruub=9.642482758s) [1] r=-1 lpr=32 pi=[30,32)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 65.956146240s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:49:22 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 32 pg[3.13( empty local-lis/les=29/30 n=0 ec=29/16 lis/c=29/29 les/c/f=30/30/0 sis=32 pruub=8.635033607s) [1] r=-1 lpr=32 pi=[29,32)/1 crt=0'0 mlcod 0'0 active pruub 64.948684692s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:49:22 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 32 pg[3.13( empty local-lis/les=29/30 n=0 ec=29/16 lis/c=29/29 les/c/f=30/30/0 sis=32 pruub=8.634982109s) [1] r=-1 lpr=32 pi=[29,32)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 64.948684692s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:49:22 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 32 pg[3.14( empty local-lis/les=29/30 n=0 ec=29/16 lis/c=29/29 les/c/f=30/30/0 sis=32 pruub=8.634825706s) [1] r=-1 lpr=32 pi=[29,32)/1 crt=0'0 mlcod 0'0 active pruub 64.948669434s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:49:22 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 32 pg[3.14( empty local-lis/les=29/30 n=0 ec=29/16 lis/c=29/29 les/c/f=30/30/0 sis=32 pruub=8.634799957s) [1] r=-1 lpr=32 pi=[29,32)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 64.948669434s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:49:22 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 32 pg[3.15( empty local-lis/les=29/30 n=0 ec=29/16 lis/c=29/29 les/c/f=30/30/0 sis=32 pruub=8.635590553s) [1] r=-1 lpr=32 pi=[29,32)/1 crt=0'0 mlcod 0'0 active pruub 64.949523926s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:49:22 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 32 pg[3.15( empty local-lis/les=29/30 n=0 ec=29/16 lis/c=29/29 les/c/f=30/30/0 sis=32 pruub=8.635569572s) [1] r=-1 lpr=32 pi=[29,32)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 64.949523926s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:49:22 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 32 pg[3.16( empty local-lis/les=29/30 n=0 ec=29/16 lis/c=29/29 les/c/f=30/30/0 sis=32 pruub=8.634659767s) [1] r=-1 lpr=32 pi=[29,32)/1 crt=0'0 mlcod 0'0 active pruub 64.948677063s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:49:22 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 32 pg[3.16( empty local-lis/les=29/30 n=0 ec=29/16 lis/c=29/29 les/c/f=30/30/0 sis=32 pruub=8.634639740s) [1] r=-1 lpr=32 pi=[29,32)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 64.948677063s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:49:22 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 32 pg[4.1f( empty local-lis/les=30/31 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=32 pruub=9.641465187s) [1] r=-1 lpr=32 pi=[30,32)/1 crt=0'0 mlcod 0'0 active pruub 65.956062317s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:49:22 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 32 pg[4.1f( empty local-lis/les=30/31 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=32 pruub=9.641440392s) [1] r=-1 lpr=32 pi=[30,32)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 65.956062317s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:49:23 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 3.1f scrub starts
Dec  1 04:49:23 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 3.1f scrub ok
Dec  1 04:49:24 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 3.1e scrub starts
Dec  1 04:49:24 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 3.1e scrub ok
Dec  1 04:49:25 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 4.19 deep-scrub starts
Dec  1 04:49:25 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 4.19 deep-scrub ok
Dec  1 04:49:25 np0005540826 ceph-mon[80026]: mon.compute-1@-1(probing) e3  my rank is now 2 (was -1)
Dec  1 04:49:25 np0005540826 ceph-mon[80026]: log_channel(cluster) log [INF] : mon.compute-1 calling monitor election
Dec  1 04:49:25 np0005540826 ceph-mon[80026]: paxos.2).electionLogic(0) init, first boot, initializing epoch at 1 
Dec  1 04:49:25 np0005540826 ceph-mon[80026]: mon.compute-1@2(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec  1 04:49:26 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 3.1b scrub starts
Dec  1 04:49:26 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 3.1b scrub ok
Dec  1 04:49:27 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 4.1c scrub starts
Dec  1 04:49:27 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 4.1c scrub ok
Dec  1 04:49:28 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 4.1d scrub starts
Dec  1 04:49:28 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 4.1d scrub ok
Dec  1 04:49:29 np0005540826 ceph-mon[80026]: mon.compute-1@2(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec  1 04:49:29 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={4=support erasure code pools,5=new-style osdmap encoding,6=support isa/lrc erasure code,7=support shec erasure code}
Dec  1 04:49:29 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={8=support monmap features,9=luminous ondisk layout,10=mimic ondisk layout,11=nautilus ondisk layout,12=octopus ondisk layout,13=pacific ondisk layout,14=quincy ondisk layout,15=reef ondisk layout,16=squid ondisk layout}
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 4.f scrub starts
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 4.f scrub ok
Dec  1 04:49:29 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e32 e32: 2 total, 2 up, 2 in
Dec  1 04:49:29 np0005540826 ceph-mon[80026]: Deploying daemon mon.compute-1 on compute-1
Dec  1 04:49:29 np0005540826 ceph-mon[80026]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num", "val": "32"}]: dispatch
Dec  1 04:49:29 np0005540826 ceph-mon[80026]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec  1 04:49:29 np0005540826 ceph-mon[80026]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec  1 04:49:29 np0005540826 ceph-mon[80026]: mon.compute-0 calling monitor election
Dec  1 04:49:29 np0005540826 ceph-mon[80026]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec  1 04:49:29 np0005540826 ceph-mon[80026]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec  1 04:49:29 np0005540826 ceph-mon[80026]: mon.compute-2 calling monitor election
Dec  1 04:49:29 np0005540826 ceph-mon[80026]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec  1 04:49:29 np0005540826 ceph-mon[80026]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec  1 04:49:29 np0005540826 ceph-mon[80026]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec  1 04:49:29 np0005540826 ceph-mon[80026]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec  1 04:49:29 np0005540826 ceph-mon[80026]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec  1 04:49:29 np0005540826 ceph-mon[80026]: mon.compute-0 is new leader, mons compute-0,compute-2 in quorum (ranks 0,1)
Dec  1 04:49:29 np0005540826 ceph-mon[80026]: Health detail: HEALTH_WARN 2 pool(s) do not have an application enabled
Dec  1 04:49:29 np0005540826 ceph-mon[80026]: [WRN] POOL_APP_NOT_ENABLED: 2 pool(s) do not have an application enabled
Dec  1 04:49:29 np0005540826 ceph-mon[80026]:    application not enabled on pool 'cephfs.cephfs.meta'
Dec  1 04:49:29 np0005540826 ceph-mon[80026]:    application not enabled on pool 'cephfs.cephfs.data'
Dec  1 04:49:29 np0005540826 ceph-mon[80026]:    use 'ceph osd pool application enable <pool-name> <app-name>', where <app-name> is 'cephfs', 'rbd', 'rgw', or freeform for custom applications.
Dec  1 04:49:29 np0005540826 ceph-mon[80026]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec  1 04:49:29 np0005540826 ceph-mon[80026]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec  1 04:49:29 np0005540826 ceph-mon[80026]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec  1 04:49:29 np0005540826 ceph-mon[80026]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec  1 04:49:29 np0005540826 ceph-mon[80026]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec  1 04:49:29 np0005540826 ceph-mon[80026]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' 
Dec  1 04:49:29 np0005540826 ceph-mon[80026]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' 
Dec  1 04:49:29 np0005540826 ceph-mon[80026]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' 
Dec  1 04:49:29 np0005540826 ceph-mon[80026]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' 
Dec  1 04:49:29 np0005540826 ceph-mon[80026]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-2.kdtkls", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Dec  1 04:49:29 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec  1 04:49:29 np0005540826 ceph-mon[80026]: mgrc update_daemon_metadata mon.compute-1 metadata {addrs=[v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0],arch=x86_64,ceph_release=squid,ceph_version=ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable),ceph_version_short=19.2.3,compression_algorithms=none, snappy, zlib, zstd, lz4,container_hostname=compute-1,container_image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec,cpu=AMD EPYC-Rome Processor,device_ids=,device_paths=vda=/dev/disk/by-path/pci-0000:00:04.0,devices=vda,distro=centos,distro_description=CentOS Stream 9,distro_version=9,hostname=compute-1,kernel_description=#1 SMP PREEMPT_DYNAMIC Thu Nov 20 14:15:03 UTC 2025,kernel_version=5.14.0-642.el9.x86_64,mem_swap_kb=1048572,mem_total_kb=7864324,os=Linux}
Dec  1 04:49:29 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e33 e33: 2 total, 2 up, 2 in
Dec  1 04:49:29 np0005540826 ceph-mon[80026]: mon.compute-0 calling monitor election
Dec  1 04:49:29 np0005540826 ceph-mon[80026]: mon.compute-2 calling monitor election
Dec  1 04:49:29 np0005540826 ceph-mon[80026]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec  1 04:49:29 np0005540826 ceph-mon[80026]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec  1 04:49:29 np0005540826 ceph-mon[80026]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec  1 04:49:29 np0005540826 ceph-mon[80026]: mon.compute-0 is new leader, mons compute-0,compute-2,compute-1 in quorum (ranks 0,1,2)
Dec  1 04:49:29 np0005540826 ceph-mon[80026]: overall HEALTH_OK
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[6.1a( empty local-lis/les=21/22 n=0 ec=32/21 lis/c=21/21 les/c/f=22/22/0 sis=32) [0] r=0 lpr=32 pi=[21,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[5.19( empty local-lis/les=19/20 n=0 ec=32/19 lis/c=19/19 les/c/f=20/20/0 sis=32) [0] r=0 lpr=32 pi=[19,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[6.1b( empty local-lis/les=21/22 n=0 ec=32/21 lis/c=21/21 les/c/f=22/22/0 sis=32) [0] r=0 lpr=32 pi=[21,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[5.18( empty local-lis/les=19/20 n=0 ec=32/19 lis/c=19/19 les/c/f=20/20/0 sis=32) [0] r=0 lpr=32 pi=[19,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[6.18( empty local-lis/les=21/22 n=0 ec=32/21 lis/c=21/21 les/c/f=22/22/0 sis=32) [0] r=0 lpr=32 pi=[21,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[6.19( empty local-lis/les=21/22 n=0 ec=32/21 lis/c=21/21 les/c/f=22/22/0 sis=32) [0] r=0 lpr=32 pi=[21,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[5.1b( empty local-lis/les=19/20 n=0 ec=32/19 lis/c=19/19 les/c/f=20/20/0 sis=32) [0] r=0 lpr=32 pi=[19,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[5.1a( empty local-lis/les=19/20 n=0 ec=32/19 lis/c=19/19 les/c/f=20/20/0 sis=32) [0] r=0 lpr=32 pi=[19,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[6.1e( empty local-lis/les=21/22 n=0 ec=32/21 lis/c=21/21 les/c/f=22/22/0 sis=32) [0] r=0 lpr=32 pi=[21,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[5.1d( empty local-lis/les=19/20 n=0 ec=32/19 lis/c=19/19 les/c/f=20/20/0 sis=32) [0] r=0 lpr=32 pi=[19,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[6.1f( empty local-lis/les=21/22 n=0 ec=32/21 lis/c=21/21 les/c/f=22/22/0 sis=32) [0] r=0 lpr=32 pi=[21,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[5.1c( empty local-lis/les=19/20 n=0 ec=32/19 lis/c=19/19 les/c/f=20/20/0 sis=32) [0] r=0 lpr=32 pi=[19,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[6.c( empty local-lis/les=21/22 n=0 ec=32/21 lis/c=21/21 les/c/f=22/22/0 sis=32) [0] r=0 lpr=32 pi=[21,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[5.f( empty local-lis/les=19/20 n=0 ec=32/19 lis/c=19/19 les/c/f=20/20/0 sis=32) [0] r=0 lpr=32 pi=[19,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[5.e( empty local-lis/les=19/20 n=0 ec=32/19 lis/c=19/19 les/c/f=20/20/0 sis=32) [0] r=0 lpr=32 pi=[19,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[6.d( empty local-lis/les=21/22 n=0 ec=32/21 lis/c=21/21 les/c/f=22/22/0 sis=32) [0] r=0 lpr=32 pi=[21,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[5.2( empty local-lis/les=19/20 n=0 ec=32/19 lis/c=19/19 les/c/f=20/20/0 sis=32) [0] r=0 lpr=32 pi=[19,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[6.1( empty local-lis/les=21/22 n=0 ec=32/21 lis/c=21/21 les/c/f=22/22/0 sis=32) [0] r=0 lpr=32 pi=[21,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[5.5( empty local-lis/les=19/20 n=0 ec=32/19 lis/c=19/19 les/c/f=20/20/0 sis=32) [0] r=0 lpr=32 pi=[19,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[6.6( empty local-lis/les=21/22 n=0 ec=32/21 lis/c=21/21 les/c/f=22/22/0 sis=32) [0] r=0 lpr=32 pi=[21,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[5.4( empty local-lis/les=19/20 n=0 ec=32/19 lis/c=19/19 les/c/f=20/20/0 sis=32) [0] r=0 lpr=32 pi=[19,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[6.7( empty local-lis/les=21/22 n=0 ec=32/21 lis/c=21/21 les/c/f=22/22/0 sis=32) [0] r=0 lpr=32 pi=[21,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[6.4( empty local-lis/les=21/22 n=0 ec=32/21 lis/c=21/21 les/c/f=22/22/0 sis=32) [0] r=0 lpr=32 pi=[21,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[5.7( empty local-lis/les=19/20 n=0 ec=32/19 lis/c=19/19 les/c/f=20/20/0 sis=32) [0] r=0 lpr=32 pi=[19,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[5.3( empty local-lis/les=19/20 n=0 ec=32/19 lis/c=19/19 les/c/f=20/20/0 sis=32) [0] r=0 lpr=32 pi=[19,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[6.3( empty local-lis/les=21/22 n=0 ec=32/21 lis/c=21/21 les/c/f=22/22/0 sis=32) [0] r=0 lpr=32 pi=[21,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[5.1( empty local-lis/les=19/20 n=0 ec=32/19 lis/c=19/19 les/c/f=20/20/0 sis=32) [0] r=0 lpr=32 pi=[19,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[5.6( empty local-lis/les=19/20 n=0 ec=32/19 lis/c=19/19 les/c/f=20/20/0 sis=32) [0] r=0 lpr=32 pi=[19,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[6.5( empty local-lis/les=21/22 n=0 ec=32/21 lis/c=21/21 les/c/f=22/22/0 sis=32) [0] r=0 lpr=32 pi=[21,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[6.2( empty local-lis/les=21/22 n=0 ec=32/21 lis/c=21/21 les/c/f=22/22/0 sis=32) [0] r=0 lpr=32 pi=[21,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[5.c( empty local-lis/les=19/20 n=0 ec=32/19 lis/c=19/19 les/c/f=20/20/0 sis=32) [0] r=0 lpr=32 pi=[19,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[6.f( empty local-lis/les=21/22 n=0 ec=32/21 lis/c=21/21 les/c/f=22/22/0 sis=32) [0] r=0 lpr=32 pi=[21,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[5.d( empty local-lis/les=19/20 n=0 ec=32/19 lis/c=19/19 les/c/f=20/20/0 sis=32) [0] r=0 lpr=32 pi=[19,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[6.e( empty local-lis/les=21/22 n=0 ec=32/21 lis/c=21/21 les/c/f=22/22/0 sis=32) [0] r=0 lpr=32 pi=[21,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[5.a( empty local-lis/les=19/20 n=0 ec=32/19 lis/c=19/19 les/c/f=20/20/0 sis=32) [0] r=0 lpr=32 pi=[19,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[6.9( empty local-lis/les=21/22 n=0 ec=32/21 lis/c=21/21 les/c/f=22/22/0 sis=32) [0] r=0 lpr=32 pi=[21,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[5.b( empty local-lis/les=19/20 n=0 ec=32/19 lis/c=19/19 les/c/f=20/20/0 sis=32) [0] r=0 lpr=32 pi=[19,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[6.8( empty local-lis/les=21/22 n=0 ec=32/21 lis/c=21/21 les/c/f=22/22/0 sis=32) [0] r=0 lpr=32 pi=[21,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[5.8( empty local-lis/les=19/20 n=0 ec=32/19 lis/c=19/19 les/c/f=20/20/0 sis=32) [0] r=0 lpr=32 pi=[19,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[5.9( empty local-lis/les=19/20 n=0 ec=32/19 lis/c=19/19 les/c/f=20/20/0 sis=32) [0] r=0 lpr=32 pi=[19,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[6.a( empty local-lis/les=21/22 n=0 ec=32/21 lis/c=21/21 les/c/f=22/22/0 sis=32) [0] r=0 lpr=32 pi=[21,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[6.15( empty local-lis/les=21/22 n=0 ec=32/21 lis/c=21/21 les/c/f=22/22/0 sis=32) [0] r=0 lpr=32 pi=[21,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[6.b( empty local-lis/les=21/22 n=0 ec=32/21 lis/c=21/21 les/c/f=22/22/0 sis=32) [0] r=0 lpr=32 pi=[21,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[5.16( empty local-lis/les=19/20 n=0 ec=32/19 lis/c=19/19 les/c/f=20/20/0 sis=32) [0] r=0 lpr=32 pi=[19,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[6.14( empty local-lis/les=21/22 n=0 ec=32/21 lis/c=21/21 les/c/f=22/22/0 sis=32) [0] r=0 lpr=32 pi=[21,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[5.17( empty local-lis/les=19/20 n=0 ec=32/19 lis/c=19/19 les/c/f=20/20/0 sis=32) [0] r=0 lpr=32 pi=[19,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[6.17( empty local-lis/les=21/22 n=0 ec=32/21 lis/c=21/21 les/c/f=22/22/0 sis=32) [0] r=0 lpr=32 pi=[21,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[5.14( empty local-lis/les=19/20 n=0 ec=32/19 lis/c=19/19 les/c/f=20/20/0 sis=32) [0] r=0 lpr=32 pi=[19,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[6.16( empty local-lis/les=21/22 n=0 ec=32/21 lis/c=21/21 les/c/f=22/22/0 sis=32) [0] r=0 lpr=32 pi=[21,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[5.15( empty local-lis/les=19/20 n=0 ec=32/19 lis/c=19/19 les/c/f=20/20/0 sis=32) [0] r=0 lpr=32 pi=[19,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[6.11( empty local-lis/les=21/22 n=0 ec=32/21 lis/c=21/21 les/c/f=22/22/0 sis=32) [0] r=0 lpr=32 pi=[21,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[5.12( empty local-lis/les=19/20 n=0 ec=32/19 lis/c=19/19 les/c/f=20/20/0 sis=32) [0] r=0 lpr=32 pi=[19,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[6.10( empty local-lis/les=21/22 n=0 ec=32/21 lis/c=21/21 les/c/f=22/22/0 sis=32) [0] r=0 lpr=32 pi=[21,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[5.13( empty local-lis/les=19/20 n=0 ec=32/19 lis/c=19/19 les/c/f=20/20/0 sis=32) [0] r=0 lpr=32 pi=[19,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[6.13( empty local-lis/les=21/22 n=0 ec=32/21 lis/c=21/21 les/c/f=22/22/0 sis=32) [0] r=0 lpr=32 pi=[21,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[5.10( empty local-lis/les=19/20 n=0 ec=32/19 lis/c=19/19 les/c/f=20/20/0 sis=32) [0] r=0 lpr=32 pi=[19,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[6.12( empty local-lis/les=21/22 n=0 ec=32/21 lis/c=21/21 les/c/f=22/22/0 sis=32) [0] r=0 lpr=32 pi=[21,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[6.1d( empty local-lis/les=21/22 n=0 ec=32/21 lis/c=21/21 les/c/f=22/22/0 sis=32) [0] r=0 lpr=32 pi=[21,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[6.1c( empty local-lis/les=21/22 n=0 ec=32/21 lis/c=21/21 les/c/f=22/22/0 sis=32) [0] r=0 lpr=32 pi=[21,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[5.1e( empty local-lis/les=19/20 n=0 ec=32/19 lis/c=19/19 les/c/f=20/20/0 sis=32) [0] r=0 lpr=32 pi=[19,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[5.1f( empty local-lis/les=19/20 n=0 ec=32/19 lis/c=19/19 les/c/f=20/20/0 sis=32) [0] r=0 lpr=32 pi=[19,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[5.11( empty local-lis/les=19/20 n=0 ec=32/19 lis/c=19/19 les/c/f=20/20/0 sis=32) [0] r=0 lpr=32 pi=[19,32)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[2.19( empty local-lis/les=32/33 n=0 ec=28/14 lis/c=28/28 les/c/f=30/30/0 sis=32) [0] r=0 lpr=32 pi=[28,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[2.e( empty local-lis/les=32/33 n=0 ec=28/14 lis/c=28/28 les/c/f=30/30/0 sis=32) [0] r=0 lpr=32 pi=[28,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[2.13( empty local-lis/les=32/33 n=0 ec=28/14 lis/c=28/28 les/c/f=30/30/0 sis=32) [0] r=0 lpr=32 pi=[28,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[2.15( empty local-lis/les=32/33 n=0 ec=28/14 lis/c=28/28 les/c/f=30/30/0 sis=32) [0] r=0 lpr=32 pi=[28,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[2.c( empty local-lis/les=32/33 n=0 ec=28/14 lis/c=28/28 les/c/f=30/30/0 sis=32) [0] r=0 lpr=32 pi=[28,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[2.d( empty local-lis/les=32/33 n=0 ec=28/14 lis/c=28/28 les/c/f=30/30/0 sis=32) [0] r=0 lpr=32 pi=[28,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[2.a( empty local-lis/les=32/33 n=0 ec=28/14 lis/c=28/28 les/c/f=30/30/0 sis=32) [0] r=0 lpr=32 pi=[28,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[2.1( empty local-lis/les=32/33 n=0 ec=28/14 lis/c=28/28 les/c/f=30/30/0 sis=32) [0] r=0 lpr=32 pi=[28,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[2.6( empty local-lis/les=32/33 n=0 ec=28/14 lis/c=28/28 les/c/f=30/30/0 sis=32) [0] r=0 lpr=32 pi=[28,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[2.4( empty local-lis/les=32/33 n=0 ec=28/14 lis/c=28/28 les/c/f=30/30/0 sis=32) [0] r=0 lpr=32 pi=[28,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[2.9( empty local-lis/les=32/33 n=0 ec=28/14 lis/c=28/28 les/c/f=30/30/0 sis=32) [0] r=0 lpr=32 pi=[28,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[2.1e( empty local-lis/les=32/33 n=0 ec=28/14 lis/c=28/28 les/c/f=30/30/0 sis=32) [0] r=0 lpr=32 pi=[28,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[2.1b( empty local-lis/les=32/33 n=0 ec=28/14 lis/c=28/28 les/c/f=30/30/0 sis=32) [0] r=0 lpr=32 pi=[28,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[2.1f( empty local-lis/les=32/33 n=0 ec=28/14 lis/c=28/28 les/c/f=30/30/0 sis=32) [0] r=0 lpr=32 pi=[28,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[5.19( empty local-lis/les=32/33 n=0 ec=32/19 lis/c=19/19 les/c/f=20/20/0 sis=32) [0] r=0 lpr=32 pi=[19,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[6.1b( empty local-lis/les=32/33 n=0 ec=32/21 lis/c=21/21 les/c/f=22/22/0 sis=32) [0] r=0 lpr=32 pi=[21,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[5.18( empty local-lis/les=32/33 n=0 ec=32/19 lis/c=19/19 les/c/f=20/20/0 sis=32) [0] r=0 lpr=32 pi=[19,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[6.18( empty local-lis/les=32/33 n=0 ec=32/21 lis/c=21/21 les/c/f=22/22/0 sis=32) [0] r=0 lpr=32 pi=[21,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[6.19( empty local-lis/les=32/33 n=0 ec=32/21 lis/c=21/21 les/c/f=22/22/0 sis=32) [0] r=0 lpr=32 pi=[21,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[6.1a( empty local-lis/les=32/33 n=0 ec=32/21 lis/c=21/21 les/c/f=22/22/0 sis=32) [0] r=0 lpr=32 pi=[21,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[5.1b( empty local-lis/les=32/33 n=0 ec=32/19 lis/c=19/19 les/c/f=20/20/0 sis=32) [0] r=0 lpr=32 pi=[19,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[2.10( empty local-lis/les=32/33 n=0 ec=28/14 lis/c=28/28 les/c/f=30/30/0 sis=32) [0] r=0 lpr=32 pi=[28,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[6.1f( empty local-lis/les=32/33 n=0 ec=32/21 lis/c=21/21 les/c/f=22/22/0 sis=32) [0] r=0 lpr=32 pi=[21,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[6.1e( empty local-lis/les=32/33 n=0 ec=32/21 lis/c=21/21 les/c/f=22/22/0 sis=32) [0] r=0 lpr=32 pi=[21,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[5.1c( empty local-lis/les=32/33 n=0 ec=32/19 lis/c=19/19 les/c/f=20/20/0 sis=32) [0] r=0 lpr=32 pi=[19,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[5.1d( empty local-lis/les=32/33 n=0 ec=32/19 lis/c=19/19 les/c/f=20/20/0 sis=32) [0] r=0 lpr=32 pi=[19,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[6.c( empty local-lis/les=32/33 n=0 ec=32/21 lis/c=21/21 les/c/f=22/22/0 sis=32) [0] r=0 lpr=32 pi=[21,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[5.f( empty local-lis/les=32/33 n=0 ec=32/19 lis/c=19/19 les/c/f=20/20/0 sis=32) [0] r=0 lpr=32 pi=[19,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[5.e( empty local-lis/les=32/33 n=0 ec=32/19 lis/c=19/19 les/c/f=20/20/0 sis=32) [0] r=0 lpr=32 pi=[19,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[6.1( empty local-lis/les=32/33 n=0 ec=32/21 lis/c=21/21 les/c/f=22/22/0 sis=32) [0] r=0 lpr=32 pi=[21,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[6.d( empty local-lis/les=32/33 n=0 ec=32/21 lis/c=21/21 les/c/f=22/22/0 sis=32) [0] r=0 lpr=32 pi=[21,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[5.5( empty local-lis/les=32/33 n=0 ec=32/19 lis/c=19/19 les/c/f=20/20/0 sis=32) [0] r=0 lpr=32 pi=[19,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[6.6( empty local-lis/les=32/33 n=0 ec=32/21 lis/c=21/21 les/c/f=22/22/0 sis=32) [0] r=0 lpr=32 pi=[21,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[6.7( empty local-lis/les=32/33 n=0 ec=32/21 lis/c=21/21 les/c/f=22/22/0 sis=32) [0] r=0 lpr=32 pi=[21,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[5.4( empty local-lis/les=32/33 n=0 ec=32/19 lis/c=19/19 les/c/f=20/20/0 sis=32) [0] r=0 lpr=32 pi=[19,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[6.4( empty local-lis/les=32/33 n=0 ec=32/21 lis/c=21/21 les/c/f=22/22/0 sis=32) [0] r=0 lpr=32 pi=[21,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[6.0( empty local-lis/les=32/33 n=0 ec=21/21 lis/c=21/21 les/c/f=22/22/0 sis=32) [0] r=0 lpr=32 pi=[21,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[5.1a( empty local-lis/les=32/33 n=0 ec=32/19 lis/c=19/19 les/c/f=20/20/0 sis=32) [0] r=0 lpr=32 pi=[19,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[5.2( empty local-lis/les=32/33 n=0 ec=32/19 lis/c=19/19 les/c/f=20/20/0 sis=32) [0] r=0 lpr=32 pi=[19,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[5.7( empty local-lis/les=32/33 n=0 ec=32/19 lis/c=19/19 les/c/f=20/20/0 sis=32) [0] r=0 lpr=32 pi=[19,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[5.0( empty local-lis/les=32/33 n=0 ec=19/19 lis/c=19/19 les/c/f=20/20/0 sis=32) [0] r=0 lpr=32 pi=[19,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[5.3( empty local-lis/les=32/33 n=0 ec=32/19 lis/c=19/19 les/c/f=20/20/0 sis=32) [0] r=0 lpr=32 pi=[19,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[6.3( empty local-lis/les=32/33 n=0 ec=32/21 lis/c=21/21 les/c/f=22/22/0 sis=32) [0] r=0 lpr=32 pi=[21,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[5.1( empty local-lis/les=32/33 n=0 ec=32/19 lis/c=19/19 les/c/f=20/20/0 sis=32) [0] r=0 lpr=32 pi=[19,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[5.6( empty local-lis/les=32/33 n=0 ec=32/19 lis/c=19/19 les/c/f=20/20/0 sis=32) [0] r=0 lpr=32 pi=[19,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[6.f( empty local-lis/les=32/33 n=0 ec=32/21 lis/c=21/21 les/c/f=22/22/0 sis=32) [0] r=0 lpr=32 pi=[21,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[5.c( empty local-lis/les=32/33 n=0 ec=32/19 lis/c=19/19 les/c/f=20/20/0 sis=32) [0] r=0 lpr=32 pi=[19,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[6.e( empty local-lis/les=32/33 n=0 ec=32/21 lis/c=21/21 les/c/f=22/22/0 sis=32) [0] r=0 lpr=32 pi=[21,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[5.a( empty local-lis/les=32/33 n=0 ec=32/19 lis/c=19/19 les/c/f=20/20/0 sis=32) [0] r=0 lpr=32 pi=[19,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[6.5( empty local-lis/les=32/33 n=0 ec=32/21 lis/c=21/21 les/c/f=22/22/0 sis=32) [0] r=0 lpr=32 pi=[21,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[5.d( empty local-lis/les=32/33 n=0 ec=32/19 lis/c=19/19 les/c/f=20/20/0 sis=32) [0] r=0 lpr=32 pi=[19,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[6.2( empty local-lis/les=32/33 n=0 ec=32/21 lis/c=21/21 les/c/f=22/22/0 sis=32) [0] r=0 lpr=32 pi=[21,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[6.9( empty local-lis/les=32/33 n=0 ec=32/21 lis/c=21/21 les/c/f=22/22/0 sis=32) [0] r=0 lpr=32 pi=[21,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[5.b( empty local-lis/les=32/33 n=0 ec=32/19 lis/c=19/19 les/c/f=20/20/0 sis=32) [0] r=0 lpr=32 pi=[19,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[6.a( empty local-lis/les=32/33 n=0 ec=32/21 lis/c=21/21 les/c/f=22/22/0 sis=32) [0] r=0 lpr=32 pi=[21,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[5.8( empty local-lis/les=32/33 n=0 ec=32/19 lis/c=19/19 les/c/f=20/20/0 sis=32) [0] r=0 lpr=32 pi=[19,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[6.15( empty local-lis/les=32/33 n=0 ec=32/21 lis/c=21/21 les/c/f=22/22/0 sis=32) [0] r=0 lpr=32 pi=[21,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[5.9( empty local-lis/les=32/33 n=0 ec=32/19 lis/c=19/19 les/c/f=20/20/0 sis=32) [0] r=0 lpr=32 pi=[19,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[5.16( empty local-lis/les=32/33 n=0 ec=32/19 lis/c=19/19 les/c/f=20/20/0 sis=32) [0] r=0 lpr=32 pi=[19,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[6.8( empty local-lis/les=32/33 n=0 ec=32/21 lis/c=21/21 les/c/f=22/22/0 sis=32) [0] r=0 lpr=32 pi=[21,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[6.14( empty local-lis/les=32/33 n=0 ec=32/21 lis/c=21/21 les/c/f=22/22/0 sis=32) [0] r=0 lpr=32 pi=[21,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[5.17( empty local-lis/les=32/33 n=0 ec=32/19 lis/c=19/19 les/c/f=20/20/0 sis=32) [0] r=0 lpr=32 pi=[19,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[6.17( empty local-lis/les=32/33 n=0 ec=32/21 lis/c=21/21 les/c/f=22/22/0 sis=32) [0] r=0 lpr=32 pi=[21,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[5.14( empty local-lis/les=32/33 n=0 ec=32/19 lis/c=19/19 les/c/f=20/20/0 sis=32) [0] r=0 lpr=32 pi=[19,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[6.b( empty local-lis/les=32/33 n=0 ec=32/21 lis/c=21/21 les/c/f=22/22/0 sis=32) [0] r=0 lpr=32 pi=[21,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[6.16( empty local-lis/les=32/33 n=0 ec=32/21 lis/c=21/21 les/c/f=22/22/0 sis=32) [0] r=0 lpr=32 pi=[21,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[5.15( empty local-lis/les=32/33 n=0 ec=32/19 lis/c=19/19 les/c/f=20/20/0 sis=32) [0] r=0 lpr=32 pi=[19,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[5.12( empty local-lis/les=32/33 n=0 ec=32/19 lis/c=19/19 les/c/f=20/20/0 sis=32) [0] r=0 lpr=32 pi=[19,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[6.11( empty local-lis/les=32/33 n=0 ec=32/21 lis/c=21/21 les/c/f=22/22/0 sis=32) [0] r=0 lpr=32 pi=[21,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[6.10( empty local-lis/les=32/33 n=0 ec=32/21 lis/c=21/21 les/c/f=22/22/0 sis=32) [0] r=0 lpr=32 pi=[21,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[5.13( empty local-lis/les=32/33 n=0 ec=32/19 lis/c=19/19 les/c/f=20/20/0 sis=32) [0] r=0 lpr=32 pi=[19,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[6.13( empty local-lis/les=32/33 n=0 ec=32/21 lis/c=21/21 les/c/f=22/22/0 sis=32) [0] r=0 lpr=32 pi=[21,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[5.10( empty local-lis/les=32/33 n=0 ec=32/19 lis/c=19/19 les/c/f=20/20/0 sis=32) [0] r=0 lpr=32 pi=[19,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[6.12( empty local-lis/les=32/33 n=0 ec=32/21 lis/c=21/21 les/c/f=22/22/0 sis=32) [0] r=0 lpr=32 pi=[21,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[6.1d( empty local-lis/les=32/33 n=0 ec=32/21 lis/c=21/21 les/c/f=22/22/0 sis=32) [0] r=0 lpr=32 pi=[21,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[5.1f( empty local-lis/les=32/33 n=0 ec=32/19 lis/c=19/19 les/c/f=20/20/0 sis=32) [0] r=0 lpr=32 pi=[19,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[6.1c( empty local-lis/les=32/33 n=0 ec=32/21 lis/c=21/21 les/c/f=22/22/0 sis=32) [0] r=0 lpr=32 pi=[21,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[5.1e( empty local-lis/les=32/33 n=0 ec=32/19 lis/c=19/19 les/c/f=20/20/0 sis=32) [0] r=0 lpr=32 pi=[19,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:29 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 33 pg[5.11( empty local-lis/les=32/33 n=0 ec=32/19 lis/c=19/19 les/c/f=20/20/0 sis=32) [0] r=0 lpr=32 pi=[19,32)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:30 np0005540826 podman[80155]: 2025-12-01 09:49:30.366255284 +0000 UTC m=+0.052764394 container create 7bc088ed23c37382ee159287cca9438576167601d681ebd18ded8f71431d2f35 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zealous_franklin, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325)
Dec  1 04:49:30 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_auth_request failed to assign global_id
Dec  1 04:49:30 np0005540826 systemd[1]: Started libpod-conmon-7bc088ed23c37382ee159287cca9438576167601d681ebd18ded8f71431d2f35.scope.
Dec  1 04:49:30 np0005540826 podman[80155]: 2025-12-01 09:49:30.339891498 +0000 UTC m=+0.026400628 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 04:49:30 np0005540826 systemd[1]: Started libcrun container.
Dec  1 04:49:30 np0005540826 podman[80155]: 2025-12-01 09:49:30.48825889 +0000 UTC m=+0.174768080 container init 7bc088ed23c37382ee159287cca9438576167601d681ebd18ded8f71431d2f35 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zealous_franklin, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, ceph=True, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS)
Dec  1 04:49:30 np0005540826 podman[80155]: 2025-12-01 09:49:30.500806256 +0000 UTC m=+0.187315366 container start 7bc088ed23c37382ee159287cca9438576167601d681ebd18ded8f71431d2f35 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zealous_franklin, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, ceph=True, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:49:30 np0005540826 zealous_franklin[80171]: 167 167
Dec  1 04:49:30 np0005540826 systemd[1]: libpod-7bc088ed23c37382ee159287cca9438576167601d681ebd18ded8f71431d2f35.scope: Deactivated successfully.
Dec  1 04:49:30 np0005540826 conmon[80171]: conmon 7bc088ed23c37382ee15 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-7bc088ed23c37382ee159287cca9438576167601d681ebd18ded8f71431d2f35.scope/container/memory.events
Dec  1 04:49:30 np0005540826 podman[80155]: 2025-12-01 09:49:30.511801931 +0000 UTC m=+0.198311141 container attach 7bc088ed23c37382ee159287cca9438576167601d681ebd18ded8f71431d2f35 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zealous_franklin, org.label-schema.build-date=20250325, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS)
Dec  1 04:49:30 np0005540826 podman[80155]: 2025-12-01 09:49:30.512647633 +0000 UTC m=+0.199156743 container died 7bc088ed23c37382ee159287cca9438576167601d681ebd18ded8f71431d2f35 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zealous_franklin, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:49:30 np0005540826 systemd[1]: var-lib-containers-storage-overlay-6b696246f19d98a3ec27e8b7165392401f9cb58a0c29598ead66a949db0c993f-merged.mount: Deactivated successfully.
Dec  1 04:49:30 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 3.8 scrub starts
Dec  1 04:49:30 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 3.8 scrub ok
Dec  1 04:49:30 np0005540826 podman[80155]: 2025-12-01 09:49:30.621268742 +0000 UTC m=+0.307777862 container remove 7bc088ed23c37382ee159287cca9438576167601d681ebd18ded8f71431d2f35 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zealous_franklin, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.license=GPLv2)
Dec  1 04:49:30 np0005540826 systemd[1]: libpod-conmon-7bc088ed23c37382ee159287cca9438576167601d681ebd18ded8f71431d2f35.scope: Deactivated successfully.
Dec  1 04:49:30 np0005540826 ceph-mon[80026]: mon.compute-1 calling monitor election
Dec  1 04:49:30 np0005540826 ceph-mon[80026]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' 
Dec  1 04:49:30 np0005540826 ceph-mon[80026]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]': finished
Dec  1 04:49:30 np0005540826 ceph-mon[80026]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]': finished
Dec  1 04:49:30 np0005540826 ceph-mon[80026]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]': finished
Dec  1 04:49:30 np0005540826 ceph-mon[80026]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' 
Dec  1 04:49:30 np0005540826 ceph-mon[80026]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' 
Dec  1 04:49:30 np0005540826 ceph-mon[80026]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' 
Dec  1 04:49:30 np0005540826 ceph-mon[80026]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-1.ymizfm", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Dec  1 04:49:30 np0005540826 ceph-mon[80026]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.compute-1.ymizfm", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished
Dec  1 04:49:30 np0005540826 ceph-mon[80026]: Deploying daemon mgr.compute-1.ymizfm on compute-1
Dec  1 04:49:30 np0005540826 systemd[1]: Reloading.
Dec  1 04:49:30 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e34 e34: 2 total, 2 up, 2 in
Dec  1 04:49:30 np0005540826 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:49:30 np0005540826 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:49:31 np0005540826 systemd[1]: Reloading.
Dec  1 04:49:31 np0005540826 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:49:31 np0005540826 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:49:31 np0005540826 systemd[1]: Starting Ceph mgr.compute-1.ymizfm for 365f19c2-81e5-5edd-b6b4-280555214d3a...
Dec  1 04:49:31 np0005540826 podman[80314]: 2025-12-01 09:49:31.553778239 +0000 UTC m=+0.037181076 container create 39c0da9ad64f7dc8899b0c294ec018ba6b37cc1c997f080b16312079963025ff (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, OSD_FLAVOR=default)
Dec  1 04:49:31 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 4.3 deep-scrub starts
Dec  1 04:49:31 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f1ee1bb90e2e5801f1fe287e62b0b38101cd61c03a0fd58d994bf8e1d09a7d64/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:49:31 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f1ee1bb90e2e5801f1fe287e62b0b38101cd61c03a0fd58d994bf8e1d09a7d64/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:49:31 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 4.3 deep-scrub ok
Dec  1 04:49:31 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f1ee1bb90e2e5801f1fe287e62b0b38101cd61c03a0fd58d994bf8e1d09a7d64/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:49:31 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f1ee1bb90e2e5801f1fe287e62b0b38101cd61c03a0fd58d994bf8e1d09a7d64/merged/var/lib/ceph/mgr/ceph-compute-1.ymizfm supports timestamps until 2038 (0x7fffffff)
Dec  1 04:49:31 np0005540826 podman[80314]: 2025-12-01 09:49:31.617900286 +0000 UTC m=+0.101303143 container init 39c0da9ad64f7dc8899b0c294ec018ba6b37cc1c997f080b16312079963025ff (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm, org.label-schema.schema-version=1.0, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS)
Dec  1 04:49:31 np0005540826 podman[80314]: 2025-12-01 09:49:31.628599093 +0000 UTC m=+0.112001930 container start 39c0da9ad64f7dc8899b0c294ec018ba6b37cc1c997f080b16312079963025ff (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm, io.buildah.version=1.40.1, ceph=True, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:49:31 np0005540826 bash[80314]: 39c0da9ad64f7dc8899b0c294ec018ba6b37cc1c997f080b16312079963025ff
Dec  1 04:49:31 np0005540826 podman[80314]: 2025-12-01 09:49:31.537907265 +0000 UTC m=+0.021310122 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 04:49:31 np0005540826 systemd[1]: Started Ceph mgr.compute-1.ymizfm for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 04:49:31 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_auth_request failed to assign global_id
Dec  1 04:49:31 np0005540826 ceph-mgr[80334]: set uid:gid to 167:167 (ceph:ceph)
Dec  1 04:49:31 np0005540826 ceph-mgr[80334]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Dec  1 04:49:31 np0005540826 ceph-mgr[80334]: pidfile_write: ignore empty --pid-file
Dec  1 04:49:31 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_auth_request failed to assign global_id
Dec  1 04:49:31 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'alerts'
Dec  1 04:49:31 np0005540826 ceph-mgr[80334]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Dec  1 04:49:31 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'balancer'
Dec  1 04:49:31 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:49:31.799+0000 7f5c97fb5140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Dec  1 04:49:31 np0005540826 ceph-mgr[80334]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Dec  1 04:49:31 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'cephadm'
Dec  1 04:49:31 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:49:31.879+0000 7f5c97fb5140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Dec  1 04:49:32 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_auth_request failed to assign global_id
Dec  1 04:49:32 np0005540826 ceph-mon[80026]: from='client.? 192.168.122.100:0/1098136028' entity='client.admin' 
Dec  1 04:49:32 np0005540826 ceph-mon[80026]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' 
Dec  1 04:49:32 np0005540826 ceph-mon[80026]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' 
Dec  1 04:49:32 np0005540826 ceph-mon[80026]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' 
Dec  1 04:49:32 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 3.4 scrub starts
Dec  1 04:49:32 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 3.4 scrub ok
Dec  1 04:49:32 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'crash'
Dec  1 04:49:32 np0005540826 ceph-mgr[80334]: mgr[py] Module crash has missing NOTIFY_TYPES member
Dec  1 04:49:32 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:49:32.760+0000 7f5c97fb5140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Dec  1 04:49:32 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'dashboard'
Dec  1 04:49:33 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'devicehealth'
Dec  1 04:49:33 np0005540826 ceph-mgr[80334]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Dec  1 04:49:33 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:49:33.457+0000 7f5c97fb5140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Dec  1 04:49:33 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'diskprediction_local'
Dec  1 04:49:33 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 4.4 scrub starts
Dec  1 04:49:33 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 4.4 scrub ok
Dec  1 04:49:33 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Dec  1 04:49:33 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Dec  1 04:49:33 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]:  from numpy import show_config as show_numpy_config
Dec  1 04:49:33 np0005540826 ceph-mgr[80334]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Dec  1 04:49:33 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'influx'
Dec  1 04:49:33 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:49:33.630+0000 7f5c97fb5140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Dec  1 04:49:33 np0005540826 ceph-mon[80026]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' 
Dec  1 04:49:33 np0005540826 ceph-mon[80026]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-2", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Dec  1 04:49:33 np0005540826 ceph-mon[80026]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd='[{"prefix": "auth get-or-create", "entity": "client.crash.compute-2", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]': finished
Dec  1 04:49:33 np0005540826 ceph-mon[80026]: Deploying daemon crash.compute-2 on compute-2
Dec  1 04:49:33 np0005540826 ceph-mon[80026]: Saving service rgw.rgw spec with placement compute-0;compute-1;compute-2
Dec  1 04:49:33 np0005540826 ceph-mon[80026]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' 
Dec  1 04:49:33 np0005540826 ceph-mon[80026]: Saving service ingress.rgw.default spec with placement count:2
Dec  1 04:49:33 np0005540826 ceph-mon[80026]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' 
Dec  1 04:49:33 np0005540826 ceph-mgr[80334]: mgr[py] Module influx has missing NOTIFY_TYPES member
Dec  1 04:49:33 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:49:33.708+0000 7f5c97fb5140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Dec  1 04:49:33 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'insights'
Dec  1 04:49:33 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'iostat'
Dec  1 04:49:33 np0005540826 ceph-mgr[80334]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Dec  1 04:49:33 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'k8sevents'
Dec  1 04:49:33 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:49:33.860+0000 7f5c97fb5140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Dec  1 04:49:34 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'localpool'
Dec  1 04:49:34 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'mds_autoscaler'
Dec  1 04:49:34 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e34 _set_new_cache_sizes cache_size:1019928146 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:49:34 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'mirroring'
Dec  1 04:49:34 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 3.2 scrub starts
Dec  1 04:49:34 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 3.2 scrub ok
Dec  1 04:49:34 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'nfs'
Dec  1 04:49:34 np0005540826 ceph-mgr[80334]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Dec  1 04:49:34 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'orchestrator'
Dec  1 04:49:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:49:34.939+0000 7f5c97fb5140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Dec  1 04:49:35 np0005540826 ceph-mgr[80334]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Dec  1 04:49:35 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'osd_perf_query'
Dec  1 04:49:35 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:49:35.173+0000 7f5c97fb5140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Dec  1 04:49:35 np0005540826 ceph-mgr[80334]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Dec  1 04:49:35 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:49:35.255+0000 7f5c97fb5140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Dec  1 04:49:35 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'osd_support'
Dec  1 04:49:35 np0005540826 ceph-mgr[80334]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Dec  1 04:49:35 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'pg_autoscaler'
Dec  1 04:49:35 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:49:35.327+0000 7f5c97fb5140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Dec  1 04:49:35 np0005540826 ceph-mgr[80334]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Dec  1 04:49:35 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'progress'
Dec  1 04:49:35 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:49:35.430+0000 7f5c97fb5140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Dec  1 04:49:35 np0005540826 ceph-mgr[80334]: mgr[py] Module progress has missing NOTIFY_TYPES member
Dec  1 04:49:35 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:49:35.514+0000 7f5c97fb5140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Dec  1 04:49:35 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'prometheus'
Dec  1 04:49:35 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 4.6 scrub starts
Dec  1 04:49:35 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e35 e35: 2 total, 2 up, 2 in
Dec  1 04:49:35 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 35 pg[6.1a( empty local-lis/les=32/33 n=0 ec=32/21 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=10.126127243s) [1] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 active pruub 79.169731140s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:49:35 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 35 pg[6.1a( empty local-lis/les=32/33 n=0 ec=32/21 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=10.126091003s) [1] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 79.169731140s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:49:35 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 35 pg[5.18( empty local-lis/les=32/33 n=0 ec=32/19 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=10.126006126s) [1] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 active pruub 79.169670105s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:49:35 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 35 pg[5.18( empty local-lis/les=32/33 n=0 ec=32/19 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=10.125990868s) [1] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 79.169670105s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:49:35 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 35 pg[5.1b( empty local-lis/les=32/33 n=0 ec=32/19 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=10.125973701s) [1] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 active pruub 79.169776917s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:49:35 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 35 pg[6.19( empty local-lis/les=32/33 n=0 ec=32/21 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=10.125914574s) [1] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 active pruub 79.169723511s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:49:35 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 35 pg[6.19( empty local-lis/les=32/33 n=0 ec=32/21 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=10.125904083s) [1] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 79.169723511s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:49:35 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 35 pg[5.1a( empty local-lis/les=32/33 n=0 ec=32/19 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=10.126134872s) [1] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 active pruub 79.169967651s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:49:35 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 35 pg[5.1b( empty local-lis/les=32/33 n=0 ec=32/19 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=10.125944138s) [1] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 79.169776917s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:49:35 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 35 pg[5.1a( empty local-lis/les=32/33 n=0 ec=32/19 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=10.126119614s) [1] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 79.169967651s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:49:35 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 35 pg[6.1e( empty local-lis/les=32/33 n=0 ec=32/21 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=10.125979424s) [1] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 active pruub 79.169898987s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:49:35 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 35 pg[6.1e( empty local-lis/les=32/33 n=0 ec=32/21 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=10.125966072s) [1] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 79.169898987s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:49:35 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 35 pg[5.f( empty local-lis/les=32/33 n=0 ec=32/19 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=10.125809669s) [1] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 active pruub 79.169998169s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:49:35 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 35 pg[5.1c( empty local-lis/les=32/33 n=0 ec=32/19 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=10.125723839s) [1] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 active pruub 79.169921875s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:49:35 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 35 pg[5.f( empty local-lis/les=32/33 n=0 ec=32/19 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=10.125784874s) [1] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 79.169998169s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:49:35 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 35 pg[5.e( empty local-lis/les=32/33 n=0 ec=32/19 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=10.125792503s) [1] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 active pruub 79.170028687s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:49:35 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 35 pg[5.1c( empty local-lis/les=32/33 n=0 ec=32/19 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=10.125702858s) [1] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 79.169921875s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:49:35 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 35 pg[5.e( empty local-lis/les=32/33 n=0 ec=32/19 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=10.125782013s) [1] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 79.170028687s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:49:35 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 35 pg[5.2( empty local-lis/les=32/33 n=0 ec=32/19 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=10.125814438s) [1] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 active pruub 79.170204163s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:49:35 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 35 pg[5.2( empty local-lis/les=32/33 n=0 ec=32/19 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=10.125793457s) [1] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 79.170204163s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:49:35 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 35 pg[5.4( empty local-lis/les=32/33 n=0 ec=32/19 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=10.125607491s) [1] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 active pruub 79.170135498s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:49:35 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 35 pg[6.7( empty local-lis/les=32/33 n=0 ec=32/21 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=10.125569344s) [1] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 active pruub 79.170112610s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:49:35 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 35 pg[5.4( empty local-lis/les=32/33 n=0 ec=32/19 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=10.125581741s) [1] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 79.170135498s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:49:35 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 35 pg[6.7( empty local-lis/les=32/33 n=0 ec=32/21 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=10.125555992s) [1] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 79.170112610s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:49:35 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 35 pg[5.7( empty local-lis/les=32/33 n=0 ec=32/19 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=10.125619888s) [1] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 active pruub 79.170219421s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:49:35 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 35 pg[5.7( empty local-lis/les=32/33 n=0 ec=32/19 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=10.125601768s) [1] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 79.170219421s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:49:35 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 35 pg[6.3( empty local-lis/les=32/33 n=0 ec=32/21 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=10.125503540s) [1] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 active pruub 79.170265198s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:49:35 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 35 pg[6.3( empty local-lis/les=32/33 n=0 ec=32/21 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=10.125491142s) [1] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 79.170265198s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:49:35 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 35 pg[6.2( empty local-lis/les=32/33 n=0 ec=32/21 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=10.125662804s) [1] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 active pruub 79.170471191s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:49:35 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 35 pg[5.1( empty local-lis/les=32/33 n=0 ec=32/19 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=10.125459671s) [1] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 active pruub 79.170272827s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:49:35 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 35 pg[6.2( empty local-lis/les=32/33 n=0 ec=32/21 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=10.125649452s) [1] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 79.170471191s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:49:35 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 35 pg[5.1( empty local-lis/les=32/33 n=0 ec=32/19 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=10.125444412s) [1] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 79.170272827s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:49:35 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 35 pg[6.d( empty local-lis/les=32/33 n=0 ec=32/21 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=10.125740051s) [1] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 active pruub 79.170074463s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:49:35 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 35 pg[6.d( empty local-lis/les=32/33 n=0 ec=32/21 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=10.125170708s) [1] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 79.170074463s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:49:35 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 35 pg[6.5( empty local-lis/les=32/33 n=0 ec=32/21 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=10.125468254s) [1] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 active pruub 79.170402527s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:49:35 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 35 pg[6.5( empty local-lis/les=32/33 n=0 ec=32/21 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=10.125457764s) [1] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 79.170402527s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:49:35 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 35 pg[6.e( empty local-lis/les=32/33 n=0 ec=32/21 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=10.125366211s) [1] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 active pruub 79.170402527s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:49:35 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 35 pg[6.e( empty local-lis/les=32/33 n=0 ec=32/21 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=10.125353813s) [1] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 79.170402527s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:49:35 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 35 pg[6.8( empty local-lis/les=32/33 n=0 ec=32/21 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=10.125370979s) [1] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 active pruub 79.170516968s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:49:35 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 35 pg[6.8( empty local-lis/les=32/33 n=0 ec=32/21 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=10.125353813s) [1] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 79.170516968s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:49:35 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 35 pg[6.a( empty local-lis/les=32/33 n=0 ec=32/21 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=10.125340462s) [1] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 active pruub 79.170524597s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:49:35 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 35 pg[6.a( empty local-lis/les=32/33 n=0 ec=32/21 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=10.125331879s) [1] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 79.170524597s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:49:35 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 35 pg[6.15( empty local-lis/les=32/33 n=0 ec=32/21 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=10.125317574s) [1] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 active pruub 79.170539856s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:49:35 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 35 pg[6.15( empty local-lis/les=32/33 n=0 ec=32/21 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=10.125301361s) [1] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 79.170539856s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:49:35 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 35 pg[5.16( empty local-lis/les=32/33 n=0 ec=32/19 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=10.125283241s) [1] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 active pruub 79.170578003s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:49:35 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 35 pg[5.16( empty local-lis/les=32/33 n=0 ec=32/19 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=10.125273705s) [1] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 79.170578003s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:49:35 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 35 pg[6.17( empty local-lis/les=32/33 n=0 ec=32/21 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=10.125270844s) [1] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 active pruub 79.170639038s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:49:35 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 35 pg[6.17( empty local-lis/les=32/33 n=0 ec=32/21 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=10.125249863s) [1] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 79.170639038s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:49:35 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 35 pg[5.15( empty local-lis/les=32/33 n=0 ec=32/19 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=10.125390053s) [1] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 active pruub 79.170829773s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:49:35 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 35 pg[5.15( empty local-lis/les=32/33 n=0 ec=32/19 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=10.125374794s) [1] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 79.170829773s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:49:35 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 35 pg[5.10( empty local-lis/les=32/33 n=0 ec=32/19 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=10.125407219s) [1] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 active pruub 79.170951843s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:49:35 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 35 pg[6.12( empty local-lis/les=32/33 n=0 ec=32/21 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=10.125422478s) [1] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 active pruub 79.170967102s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:49:35 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 35 pg[5.10( empty local-lis/les=32/33 n=0 ec=32/19 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=10.125385284s) [1] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 79.170951843s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:49:35 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 35 pg[6.12( empty local-lis/les=32/33 n=0 ec=32/21 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=10.125388145s) [1] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 79.170967102s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:49:35 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 35 pg[5.11( empty local-lis/les=32/33 n=0 ec=32/19 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=10.125454903s) [1] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 active pruub 79.171058655s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:49:35 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 35 pg[5.11( empty local-lis/les=32/33 n=0 ec=32/19 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=10.125440598s) [1] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 79.171058655s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:49:35 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 35 pg[6.1c( empty local-lis/les=32/33 n=0 ec=32/21 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=10.125296593s) [1] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 active pruub 79.171028137s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:49:35 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 35 pg[5.1f( empty local-lis/les=32/33 n=0 ec=32/19 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=10.125273705s) [1] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 active pruub 79.171028137s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:49:35 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 35 pg[5.1f( empty local-lis/les=32/33 n=0 ec=32/19 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=10.125247955s) [1] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 79.171028137s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:49:35 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 35 pg[5.9( empty local-lis/les=32/33 n=0 ec=32/19 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=10.124732018s) [1] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 active pruub 79.170555115s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:49:35 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 35 pg[5.9( empty local-lis/les=32/33 n=0 ec=32/19 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=10.124720573s) [1] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 79.170555115s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:49:35 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 35 pg[6.1c( empty local-lis/les=32/33 n=0 ec=32/21 lis/c=32/32 les/c/f=33/33/0 sis=35 pruub=10.125282288s) [1] r=-1 lpr=35 pi=[32,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 79.171028137s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:49:35 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 4.6 scrub ok
Dec  1 04:49:35 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 35 pg[7.1d( empty local-lis/les=0/0 n=0 ec=33/22 lis/c=33/33 les/c/f=34/34/0 sis=35) [0] r=0 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:35 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 35 pg[7.10( empty local-lis/les=0/0 n=0 ec=33/22 lis/c=33/33 les/c/f=34/34/0 sis=35) [0] r=0 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:35 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 35 pg[7.13( empty local-lis/les=0/0 n=0 ec=33/22 lis/c=33/33 les/c/f=34/34/0 sis=35) [0] r=0 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:35 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 35 pg[7.14( empty local-lis/les=0/0 n=0 ec=33/22 lis/c=33/33 les/c/f=34/34/0 sis=35) [0] r=0 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:35 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 35 pg[7.a( empty local-lis/les=0/0 n=0 ec=33/22 lis/c=33/33 les/c/f=34/34/0 sis=35) [0] r=0 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:35 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 35 pg[7.b( empty local-lis/les=0/0 n=0 ec=33/22 lis/c=33/33 les/c/f=34/34/0 sis=35) [0] r=0 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:35 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 35 pg[7.9( empty local-lis/les=0/0 n=0 ec=33/22 lis/c=33/33 les/c/f=34/34/0 sis=35) [0] r=0 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:35 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 35 pg[7.8( empty local-lis/les=0/0 n=0 ec=33/22 lis/c=33/33 les/c/f=34/34/0 sis=35) [0] r=0 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:35 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 35 pg[7.e( empty local-lis/les=0/0 n=0 ec=33/22 lis/c=33/33 les/c/f=34/34/0 sis=35) [0] r=0 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:35 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 35 pg[7.6( empty local-lis/les=0/0 n=0 ec=33/22 lis/c=33/33 les/c/f=34/34/0 sis=35) [0] r=0 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:35 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 35 pg[7.4( empty local-lis/les=0/0 n=0 ec=33/22 lis/c=33/33 les/c/f=34/34/0 sis=35) [0] r=0 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:35 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 35 pg[7.2( empty local-lis/les=0/0 n=0 ec=33/22 lis/c=33/33 les/c/f=34/34/0 sis=35) [0] r=0 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:35 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 35 pg[7.3( empty local-lis/les=0/0 n=0 ec=33/22 lis/c=33/33 les/c/f=34/34/0 sis=35) [0] r=0 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:35 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 35 pg[7.1e( empty local-lis/les=0/0 n=0 ec=33/22 lis/c=33/33 les/c/f=34/34/0 sis=35) [0] r=0 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:35 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 35 pg[7.f( empty local-lis/les=0/0 n=0 ec=33/22 lis/c=33/33 les/c/f=34/34/0 sis=35) [0] r=0 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:35 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 35 pg[7.18( empty local-lis/les=0/0 n=0 ec=33/22 lis/c=33/33 les/c/f=34/34/0 sis=35) [0] r=0 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:35 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 35 pg[7.1b( empty local-lis/les=0/0 n=0 ec=33/22 lis/c=33/33 les/c/f=34/34/0 sis=35) [0] r=0 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:49:35 np0005540826 ceph-mgr[80334]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Dec  1 04:49:35 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:49:35.922+0000 7f5c97fb5140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Dec  1 04:49:35 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'rbd_support'
Dec  1 04:49:36 np0005540826 ceph-mgr[80334]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Dec  1 04:49:36 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'restful'
Dec  1 04:49:36 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:49:36.027+0000 7f5c97fb5140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Dec  1 04:49:36 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'rgw'
Dec  1 04:49:36 np0005540826 ceph-mgr[80334]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Dec  1 04:49:36 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'rook'
Dec  1 04:49:36 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:49:36.517+0000 7f5c97fb5140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Dec  1 04:49:36 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 3.1 scrub starts
Dec  1 04:49:36 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 3.1 scrub ok
Dec  1 04:49:37 np0005540826 ceph-mgr[80334]: mgr[py] Module rook has missing NOTIFY_TYPES member
Dec  1 04:49:37 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'selftest'
Dec  1 04:49:37 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:49:37.103+0000 7f5c97fb5140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Dec  1 04:49:37 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e36 e36: 2 total, 2 up, 2 in
Dec  1 04:49:37 np0005540826 ceph-mon[80026]: Saving service node-exporter spec with placement *
Dec  1 04:49:37 np0005540826 ceph-mon[80026]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec  1 04:49:37 np0005540826 ceph-mon[80026]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec  1 04:49:37 np0005540826 ceph-mon[80026]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec  1 04:49:37 np0005540826 ceph-mon[80026]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pgp_num_actual", "val": "32"}]': finished
Dec  1 04:49:37 np0005540826 ceph-mon[80026]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "32"}]': finished
Dec  1 04:49:37 np0005540826 ceph-mon[80026]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pgp_num_actual", "val": "32"}]': finished
Dec  1 04:49:37 np0005540826 ceph-mon[80026]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' 
Dec  1 04:49:37 np0005540826 ceph-mon[80026]: Saving service grafana spec with placement compute-0;count:1
Dec  1 04:49:37 np0005540826 ceph-mon[80026]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' 
Dec  1 04:49:37 np0005540826 ceph-mon[80026]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' 
Dec  1 04:49:37 np0005540826 ceph-mon[80026]: Saving service prometheus spec with placement compute-0;count:1
Dec  1 04:49:37 np0005540826 ceph-mon[80026]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' 
Dec  1 04:49:37 np0005540826 ceph-mon[80026]: Saving service alertmanager spec with placement compute-0;count:1
Dec  1 04:49:37 np0005540826 ceph-mon[80026]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' 
Dec  1 04:49:37 np0005540826 ceph-mon[80026]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' 
Dec  1 04:49:37 np0005540826 ceph-mon[80026]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' 
Dec  1 04:49:37 np0005540826 ceph-mon[80026]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' 
Dec  1 04:49:37 np0005540826 ceph-mon[80026]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' 
Dec  1 04:49:37 np0005540826 ceph-mon[80026]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 04:49:37 np0005540826 ceph-mon[80026]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 04:49:37 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 36 pg[7.13( empty local-lis/les=35/36 n=0 ec=33/22 lis/c=33/33 les/c/f=34/34/0 sis=35) [0] r=0 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:37 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 36 pg[7.a( empty local-lis/les=35/36 n=0 ec=33/22 lis/c=33/33 les/c/f=34/34/0 sis=35) [0] r=0 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:37 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 36 pg[7.1d( empty local-lis/les=35/36 n=0 ec=33/22 lis/c=33/33 les/c/f=34/34/0 sis=35) [0] r=0 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:37 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 36 pg[7.9( empty local-lis/les=35/36 n=0 ec=33/22 lis/c=33/33 les/c/f=34/34/0 sis=35) [0] r=0 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:37 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 36 pg[7.8( empty local-lis/les=35/36 n=0 ec=33/22 lis/c=33/33 les/c/f=34/34/0 sis=35) [0] r=0 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:37 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 36 pg[7.f( empty local-lis/les=35/36 n=0 ec=33/22 lis/c=33/33 les/c/f=34/34/0 sis=35) [0] r=0 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:37 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 36 pg[7.e( empty local-lis/les=35/36 n=0 ec=33/22 lis/c=33/33 les/c/f=34/34/0 sis=35) [0] r=0 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:37 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 36 pg[7.4( empty local-lis/les=35/36 n=0 ec=33/22 lis/c=33/33 les/c/f=34/34/0 sis=35) [0] r=0 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:37 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 36 pg[7.3( empty local-lis/les=35/36 n=0 ec=33/22 lis/c=33/33 les/c/f=34/34/0 sis=35) [0] r=0 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:37 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 36 pg[7.2( empty local-lis/les=35/36 n=0 ec=33/22 lis/c=33/33 les/c/f=34/34/0 sis=35) [0] r=0 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:37 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 36 pg[7.18( empty local-lis/les=35/36 n=0 ec=33/22 lis/c=33/33 les/c/f=34/34/0 sis=35) [0] r=0 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:37 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 36 pg[7.1e( empty local-lis/les=35/36 n=0 ec=33/22 lis/c=33/33 les/c/f=34/34/0 sis=35) [0] r=0 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:37 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 36 pg[7.14( empty local-lis/les=35/36 n=0 ec=33/22 lis/c=33/33 les/c/f=34/34/0 sis=35) [0] r=0 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:37 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 36 pg[7.b( empty local-lis/les=35/36 n=0 ec=33/22 lis/c=33/33 les/c/f=34/34/0 sis=35) [0] r=0 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:37 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 36 pg[7.10( empty local-lis/les=35/36 n=0 ec=33/22 lis/c=33/33 les/c/f=34/34/0 sis=35) [0] r=0 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:37 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 36 pg[7.6( empty local-lis/les=35/36 n=0 ec=33/22 lis/c=33/33 les/c/f=34/34/0 sis=35) [0] r=0 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:37 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 36 pg[7.1b( empty local-lis/les=35/36 n=0 ec=33/22 lis/c=33/33 les/c/f=34/34/0 sis=35) [0] r=0 lpr=35 pi=[33,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:49:37 np0005540826 ceph-mgr[80334]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Dec  1 04:49:37 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'snap_schedule'
Dec  1 04:49:37 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:49:37.194+0000 7f5c97fb5140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Dec  1 04:49:37 np0005540826 ceph-mgr[80334]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Dec  1 04:49:37 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'stats'
Dec  1 04:49:37 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:49:37.292+0000 7f5c97fb5140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Dec  1 04:49:37 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'status'
Dec  1 04:49:37 np0005540826 ceph-mgr[80334]: mgr[py] Module status has missing NOTIFY_TYPES member
Dec  1 04:49:37 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'telegraf'
Dec  1 04:49:37 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:49:37.446+0000 7f5c97fb5140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Dec  1 04:49:37 np0005540826 ceph-mgr[80334]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Dec  1 04:49:37 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'telemetry'
Dec  1 04:49:37 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:49:37.518+0000 7f5c97fb5140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Dec  1 04:49:37 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 4.2 scrub starts
Dec  1 04:49:37 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 4.2 scrub ok
Dec  1 04:49:37 np0005540826 ceph-mgr[80334]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Dec  1 04:49:37 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'test_orchestrator'
Dec  1 04:49:37 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:49:37.678+0000 7f5c97fb5140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Dec  1 04:49:37 np0005540826 ceph-mgr[80334]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Dec  1 04:49:37 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'volumes'
Dec  1 04:49:37 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:49:37.937+0000 7f5c97fb5140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Dec  1 04:49:38 np0005540826 ceph-mgr[80334]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Dec  1 04:49:38 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'zabbix'
Dec  1 04:49:38 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:49:38.210+0000 7f5c97fb5140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Dec  1 04:49:38 np0005540826 ceph-mgr[80334]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Dec  1 04:49:38 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:49:38.281+0000 7f5c97fb5140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Dec  1 04:49:38 np0005540826 ceph-mgr[80334]: ms_deliver_dispatch: unhandled message 0x5603fddd8d00 mon_map magic: 0 from mon.0 v2:192.168.122.100:3300/0
Dec  1 04:49:38 np0005540826 ceph-mon[80026]: from='client.? 192.168.122.100:0/3669410899' entity='client.admin' 
Dec  1 04:49:38 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 3.6 scrub starts
Dec  1 04:49:38 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 3.6 scrub ok
Dec  1 04:49:39 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e36 _set_new_cache_sizes cache_size:1020053071 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:49:39 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 3.7 scrub starts
Dec  1 04:49:39 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 3.7 scrub ok
Dec  1 04:49:40 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e37 e37: 3 total, 2 up, 3 in
Dec  1 04:49:40 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 4.0 scrub starts
Dec  1 04:49:40 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 4.0 scrub ok
Dec  1 04:49:41 np0005540826 ceph-mon[80026]: from='client.? 192.168.122.100:0/819597' entity='client.admin' 
Dec  1 04:49:41 np0005540826 ceph-mon[80026]: from='client.? 192.168.122.102:0/1836222916' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "0eea832e-1517-4443-89c1-2611993976f8"}]: dispatch
Dec  1 04:49:41 np0005540826 ceph-mon[80026]: from='client.? ' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "0eea832e-1517-4443-89c1-2611993976f8"}]: dispatch
Dec  1 04:49:41 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 4.7 scrub starts
Dec  1 04:49:41 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 4.7 scrub ok
Dec  1 04:49:42 np0005540826 ceph-mon[80026]: from='client.? ' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "0eea832e-1517-4443-89c1-2611993976f8"}]': finished
Dec  1 04:49:42 np0005540826 ceph-mon[80026]: from='client.? 192.168.122.100:0/88022779' entity='client.admin' 
Dec  1 04:49:42 np0005540826 ceph-mon[80026]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' 
Dec  1 04:49:42 np0005540826 ceph-mon[80026]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' 
Dec  1 04:49:42 np0005540826 ceph-mon[80026]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' 
Dec  1 04:49:42 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 3.0 scrub starts
Dec  1 04:49:42 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 3.0 scrub ok
Dec  1 04:49:43 np0005540826 ceph-mon[80026]: from='client.? 192.168.122.100:0/4060395120' entity='client.admin' 
Dec  1 04:49:43 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 3.b scrub starts
Dec  1 04:49:43 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 3.b scrub ok
Dec  1 04:49:43 np0005540826 python3[80391]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a -f 'name=ceph-?(.*)-mgr.*' --format \{\{\.Command\}\} --no-trunc#012 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:49:44 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e37 _set_new_cache_sizes cache_size:1020054708 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:49:44 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 4.b scrub starts
Dec  1 04:49:44 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 4.b scrub ok
Dec  1 04:49:45 np0005540826 ceph-mon[80026]: from='client.? 192.168.122.100:0/1919605233' entity='client.admin' 
Dec  1 04:49:45 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 4.17 scrub starts
Dec  1 04:49:45 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 4.17 scrub ok
Dec  1 04:49:46 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 4.16 scrub starts
Dec  1 04:49:46 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 4.16 scrub ok
Dec  1 04:49:47 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 3.12 scrub starts
Dec  1 04:49:47 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 3.12 scrub ok
Dec  1 04:49:47 np0005540826 ceph-mon[80026]: from='client.? 192.168.122.100:0/3251606112' entity='client.admin' 
Dec  1 04:49:48 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 4.14 deep-scrub starts
Dec  1 04:49:48 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 4.14 deep-scrub ok
Dec  1 04:49:49 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e37 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:49:49 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 4.12 scrub starts
Dec  1 04:49:49 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 4.12 scrub ok
Dec  1 04:49:49 np0005540826 ceph-mon[80026]: from='client.? 192.168.122.100:0/2873131532' entity='client.admin' cmd=[{"prefix": "mgr module disable", "module": "dashboard"}]: dispatch
Dec  1 04:49:50 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 4.11 deep-scrub starts
Dec  1 04:49:50 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 4.11 deep-scrub ok
Dec  1 04:49:50 np0005540826 ceph-mon[80026]: from='client.? 192.168.122.100:0/2873131532' entity='client.admin' cmd='[{"prefix": "mgr module disable", "module": "dashboard"}]': finished
Dec  1 04:49:50 np0005540826 ceph-mon[80026]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "osd.2"}]: dispatch
Dec  1 04:49:51 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 4.10 scrub starts
Dec  1 04:49:51 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 4.10 scrub ok
Dec  1 04:49:51 np0005540826 ceph-mon[80026]: Deploying daemon osd.2 on compute-2
Dec  1 04:49:51 np0005540826 ceph-mon[80026]: from='client.? 192.168.122.100:0/3031876280' entity='client.admin' cmd=[{"prefix": "mgr module enable", "module": "dashboard"}]: dispatch
Dec  1 04:49:52 np0005540826 ceph-mgr[80334]: mgr handle_mgr_map respawning because set of enabled modules changed!
Dec  1 04:49:52 np0005540826 ceph-mgr[80334]: mgr respawn  e: '/usr/bin/ceph-mgr'
Dec  1 04:49:52 np0005540826 ceph-mgr[80334]: mgr respawn  0: '/usr/bin/ceph-mgr'
Dec  1 04:49:52 np0005540826 ceph-mgr[80334]: mgr respawn  1: '-n'
Dec  1 04:49:52 np0005540826 ceph-mgr[80334]: mgr respawn  2: 'mgr.compute-1.ymizfm'
Dec  1 04:49:52 np0005540826 ceph-mgr[80334]: mgr respawn  3: '-f'
Dec  1 04:49:52 np0005540826 ceph-mgr[80334]: mgr respawn  4: '--setuser'
Dec  1 04:49:52 np0005540826 ceph-mgr[80334]: mgr respawn  5: 'ceph'
Dec  1 04:49:52 np0005540826 ceph-mgr[80334]: mgr respawn  6: '--setgroup'
Dec  1 04:49:52 np0005540826 ceph-mgr[80334]: mgr respawn  7: 'ceph'
Dec  1 04:49:52 np0005540826 ceph-mgr[80334]: mgr respawn  8: '--default-log-to-file=false'
Dec  1 04:49:52 np0005540826 ceph-mgr[80334]: mgr respawn  9: '--default-log-to-journald=true'
Dec  1 04:49:52 np0005540826 ceph-mgr[80334]: mgr respawn  10: '--default-log-to-stderr=false'
Dec  1 04:49:52 np0005540826 ceph-mgr[80334]: mgr respawn respawning with exe /usr/bin/ceph-mgr
Dec  1 04:49:52 np0005540826 ceph-mgr[80334]: mgr respawn  exe_path /proc/self/exe
Dec  1 04:49:52 np0005540826 systemd[1]: session-20.scope: Deactivated successfully.
Dec  1 04:49:52 np0005540826 systemd[1]: session-32.scope: Deactivated successfully.
Dec  1 04:49:52 np0005540826 systemd[1]: session-32.scope: Consumed 57.449s CPU time.
Dec  1 04:49:52 np0005540826 systemd[1]: session-27.scope: Deactivated successfully.
Dec  1 04:49:52 np0005540826 systemd[1]: session-23.scope: Deactivated successfully.
Dec  1 04:49:52 np0005540826 systemd[1]: session-26.scope: Deactivated successfully.
Dec  1 04:49:52 np0005540826 systemd[1]: session-29.scope: Deactivated successfully.
Dec  1 04:49:52 np0005540826 systemd[1]: session-31.scope: Deactivated successfully.
Dec  1 04:49:52 np0005540826 systemd[1]: session-25.scope: Deactivated successfully.
Dec  1 04:49:52 np0005540826 systemd[1]: session-24.scope: Deactivated successfully.
Dec  1 04:49:52 np0005540826 systemd[1]: session-28.scope: Deactivated successfully.
Dec  1 04:49:52 np0005540826 systemd[1]: session-30.scope: Deactivated successfully.
Dec  1 04:49:52 np0005540826 systemd[1]: session-22.scope: Deactivated successfully.
Dec  1 04:49:52 np0005540826 systemd-logind[787]: Session 20 logged out. Waiting for processes to exit.
Dec  1 04:49:52 np0005540826 systemd-logind[787]: Session 32 logged out. Waiting for processes to exit.
Dec  1 04:49:52 np0005540826 systemd-logind[787]: Session 30 logged out. Waiting for processes to exit.
Dec  1 04:49:52 np0005540826 systemd-logind[787]: Session 23 logged out. Waiting for processes to exit.
Dec  1 04:49:52 np0005540826 systemd-logind[787]: Session 27 logged out. Waiting for processes to exit.
Dec  1 04:49:52 np0005540826 systemd-logind[787]: Session 25 logged out. Waiting for processes to exit.
Dec  1 04:49:52 np0005540826 systemd-logind[787]: Session 29 logged out. Waiting for processes to exit.
Dec  1 04:49:52 np0005540826 systemd-logind[787]: Session 22 logged out. Waiting for processes to exit.
Dec  1 04:49:52 np0005540826 systemd-logind[787]: Session 26 logged out. Waiting for processes to exit.
Dec  1 04:49:52 np0005540826 systemd-logind[787]: Session 24 logged out. Waiting for processes to exit.
Dec  1 04:49:52 np0005540826 systemd-logind[787]: Session 28 logged out. Waiting for processes to exit.
Dec  1 04:49:52 np0005540826 systemd-logind[787]: Session 31 logged out. Waiting for processes to exit.
Dec  1 04:49:52 np0005540826 systemd-logind[787]: Removed session 20.
Dec  1 04:49:52 np0005540826 systemd-logind[787]: Removed session 32.
Dec  1 04:49:52 np0005540826 systemd-logind[787]: Removed session 27.
Dec  1 04:49:52 np0005540826 systemd-logind[787]: Removed session 23.
Dec  1 04:49:52 np0005540826 systemd-logind[787]: Removed session 26.
Dec  1 04:49:52 np0005540826 systemd-logind[787]: Removed session 29.
Dec  1 04:49:52 np0005540826 systemd-logind[787]: Removed session 31.
Dec  1 04:49:52 np0005540826 systemd-logind[787]: Removed session 25.
Dec  1 04:49:52 np0005540826 systemd-logind[787]: Removed session 24.
Dec  1 04:49:52 np0005540826 systemd-logind[787]: Removed session 28.
Dec  1 04:49:52 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: ignoring --setuser ceph since I am not root
Dec  1 04:49:52 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: ignoring --setgroup ceph since I am not root
Dec  1 04:49:52 np0005540826 systemd-logind[787]: Removed session 30.
Dec  1 04:49:52 np0005540826 systemd-logind[787]: Removed session 22.
Dec  1 04:49:52 np0005540826 ceph-mgr[80334]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Dec  1 04:49:52 np0005540826 ceph-mgr[80334]: pidfile_write: ignore empty --pid-file
Dec  1 04:49:52 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'alerts'
Dec  1 04:49:52 np0005540826 ceph-mgr[80334]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Dec  1 04:49:52 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:49:52.319+0000 7f2ca604b140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Dec  1 04:49:52 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'balancer'
Dec  1 04:49:52 np0005540826 ceph-mgr[80334]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Dec  1 04:49:52 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:49:52.408+0000 7f2ca604b140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Dec  1 04:49:52 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'cephadm'
Dec  1 04:49:52 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 3.18 scrub starts
Dec  1 04:49:52 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 3.18 scrub ok
Dec  1 04:49:53 np0005540826 ceph-mon[80026]: from='client.? 192.168.122.100:0/3031876280' entity='client.admin' cmd='[{"prefix": "mgr module enable", "module": "dashboard"}]': finished
Dec  1 04:49:53 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'crash'
Dec  1 04:49:53 np0005540826 ceph-mgr[80334]: mgr[py] Module crash has missing NOTIFY_TYPES member
Dec  1 04:49:53 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:49:53.265+0000 7f2ca604b140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Dec  1 04:49:53 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'dashboard'
Dec  1 04:49:53 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 4.1e scrub starts
Dec  1 04:49:53 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 4.1e scrub ok
Dec  1 04:49:53 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'devicehealth'
Dec  1 04:49:53 np0005540826 ceph-mgr[80334]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Dec  1 04:49:53 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'diskprediction_local'
Dec  1 04:49:53 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:49:53.888+0000 7f2ca604b140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Dec  1 04:49:54 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Dec  1 04:49:54 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Dec  1 04:49:54 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]:  from numpy import show_config as show_numpy_config
Dec  1 04:49:54 np0005540826 ceph-mgr[80334]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Dec  1 04:49:54 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:49:54.056+0000 7f2ca604b140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Dec  1 04:49:54 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'influx'
Dec  1 04:49:54 np0005540826 ceph-mgr[80334]: mgr[py] Module influx has missing NOTIFY_TYPES member
Dec  1 04:49:54 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:49:54.123+0000 7f2ca604b140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Dec  1 04:49:54 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'insights'
Dec  1 04:49:54 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'iostat'
Dec  1 04:49:54 np0005540826 ceph-mgr[80334]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Dec  1 04:49:54 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:49:54.265+0000 7f2ca604b140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Dec  1 04:49:54 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'k8sevents'
Dec  1 04:49:54 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e37 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:49:54 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 2.19 scrub starts
Dec  1 04:49:54 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 2.19 scrub ok
Dec  1 04:49:54 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'localpool'
Dec  1 04:49:54 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'mds_autoscaler'
Dec  1 04:49:54 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'mirroring'
Dec  1 04:49:55 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'nfs'
Dec  1 04:49:55 np0005540826 ceph-mgr[80334]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Dec  1 04:49:55 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:49:55.338+0000 7f2ca604b140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Dec  1 04:49:55 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'orchestrator'
Dec  1 04:49:55 np0005540826 ceph-mgr[80334]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Dec  1 04:49:55 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'osd_perf_query'
Dec  1 04:49:55 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:49:55.569+0000 7f2ca604b140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Dec  1 04:49:55 np0005540826 ceph-mgr[80334]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Dec  1 04:49:55 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:49:55.651+0000 7f2ca604b140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Dec  1 04:49:55 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'osd_support'
Dec  1 04:49:55 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 2.15 scrub starts
Dec  1 04:49:55 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 2.15 scrub ok
Dec  1 04:49:55 np0005540826 ceph-mgr[80334]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Dec  1 04:49:55 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:49:55.725+0000 7f2ca604b140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Dec  1 04:49:55 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'pg_autoscaler'
Dec  1 04:49:55 np0005540826 ceph-mgr[80334]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Dec  1 04:49:55 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:49:55.806+0000 7f2ca604b140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Dec  1 04:49:55 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'progress'
Dec  1 04:49:55 np0005540826 ceph-mgr[80334]: mgr[py] Module progress has missing NOTIFY_TYPES member
Dec  1 04:49:55 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'prometheus'
Dec  1 04:49:55 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:49:55.879+0000 7f2ca604b140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Dec  1 04:49:56 np0005540826 ceph-mgr[80334]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Dec  1 04:49:56 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:49:56.232+0000 7f2ca604b140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Dec  1 04:49:56 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'rbd_support'
Dec  1 04:49:56 np0005540826 ceph-mgr[80334]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Dec  1 04:49:56 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:49:56.322+0000 7f2ca604b140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Dec  1 04:49:56 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'restful'
Dec  1 04:49:56 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'rgw'
Dec  1 04:49:56 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 2.e scrub starts
Dec  1 04:49:56 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 2.e scrub ok
Dec  1 04:49:56 np0005540826 ceph-mgr[80334]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Dec  1 04:49:56 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:49:56.751+0000 7f2ca604b140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Dec  1 04:49:56 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'rook'
Dec  1 04:49:57 np0005540826 ceph-mgr[80334]: mgr[py] Module rook has missing NOTIFY_TYPES member
Dec  1 04:49:57 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:49:57.313+0000 7f2ca604b140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Dec  1 04:49:57 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'selftest'
Dec  1 04:49:57 np0005540826 ceph-mgr[80334]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Dec  1 04:49:57 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:49:57.383+0000 7f2ca604b140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Dec  1 04:49:57 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'snap_schedule'
Dec  1 04:49:57 np0005540826 ceph-mgr[80334]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Dec  1 04:49:57 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:49:57.464+0000 7f2ca604b140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Dec  1 04:49:57 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'stats'
Dec  1 04:49:57 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'status'
Dec  1 04:49:57 np0005540826 ceph-mgr[80334]: mgr[py] Module status has missing NOTIFY_TYPES member
Dec  1 04:49:57 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'telegraf'
Dec  1 04:49:57 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:49:57.626+0000 7f2ca604b140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Dec  1 04:49:57 np0005540826 ceph-mgr[80334]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Dec  1 04:49:57 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:49:57.704+0000 7f2ca604b140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Dec  1 04:49:57 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'telemetry'
Dec  1 04:49:57 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 2.13 scrub starts
Dec  1 04:49:57 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 2.13 scrub ok
Dec  1 04:49:57 np0005540826 ceph-mgr[80334]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Dec  1 04:49:57 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:49:57.870+0000 7f2ca604b140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Dec  1 04:49:57 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'test_orchestrator'
Dec  1 04:49:58 np0005540826 ceph-mgr[80334]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Dec  1 04:49:58 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:49:58.100+0000 7f2ca604b140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Dec  1 04:49:58 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'volumes'
Dec  1 04:49:58 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e38 e38: 3 total, 2 up, 3 in
Dec  1 04:49:58 np0005540826 ceph-mgr[80334]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Dec  1 04:49:58 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:49:58.363+0000 7f2ca604b140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Dec  1 04:49:58 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'zabbix'
Dec  1 04:49:58 np0005540826 ceph-mgr[80334]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Dec  1 04:49:58 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:49:58.440+0000 7f2ca604b140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Dec  1 04:49:58 np0005540826 ceph-mgr[80334]: [dashboard DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec  1 04:49:58 np0005540826 ceph-mgr[80334]: mgr load Constructed class from module: dashboard
Dec  1 04:49:58 np0005540826 ceph-mgr[80334]: [dashboard INFO root] server: ssl=no host=192.168.122.101 port=8443
Dec  1 04:49:58 np0005540826 ceph-mgr[80334]: [dashboard INFO root] Configured CherryPy, starting engine...
Dec  1 04:49:58 np0005540826 ceph-mgr[80334]: [dashboard INFO root] Starting engine...
Dec  1 04:49:58 np0005540826 ceph-mgr[80334]: ms_deliver_dispatch: unhandled message 0x55664124b860 mon_map magic: 0 from mon.2 v2:192.168.122.101:3300/0
Dec  1 04:49:58 np0005540826 ceph-mon[80026]: from='osd.2 [v2:192.168.122.102:6800/1185161015,v1:192.168.122.102:6801/1185161015]' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch
Dec  1 04:49:58 np0005540826 ceph-mon[80026]: from='osd.2 ' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch
Dec  1 04:49:58 np0005540826 ceph-mon[80026]: Active manager daemon compute-0.fospow restarted
Dec  1 04:49:58 np0005540826 ceph-mon[80026]: Activating manager daemon compute-0.fospow
Dec  1 04:49:58 np0005540826 ceph-mon[80026]: from='osd.2 ' entity='osd.2' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]': finished
Dec  1 04:49:58 np0005540826 ceph-mon[80026]: from='osd.2 [v2:192.168.122.102:6800/1185161015,v1:192.168.122.102:6801/1185161015]' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=compute-2", "root=default"]}]: dispatch
Dec  1 04:49:58 np0005540826 ceph-mon[80026]: from='osd.2 ' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=compute-2", "root=default"]}]: dispatch
Dec  1 04:49:58 np0005540826 ceph-mon[80026]: Manager daemon compute-0.fospow is now available
Dec  1 04:49:58 np0005540826 ceph-mon[80026]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.fospow/mirror_snapshot_schedule"}]: dispatch
Dec  1 04:49:58 np0005540826 ceph-mon[80026]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.fospow/trash_purge_schedule"}]: dispatch
Dec  1 04:49:58 np0005540826 ceph-mgr[80334]: [dashboard INFO root] Engine started...
Dec  1 04:49:58 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 2.d deep-scrub starts
Dec  1 04:49:58 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 2.d deep-scrub ok
Dec  1 04:49:58 np0005540826 systemd-logind[787]: New session 33 of user ceph-admin.
Dec  1 04:49:58 np0005540826 systemd[1]: Started Session 33 of User ceph-admin.
Dec  1 04:49:59 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e38 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:49:59 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e39 e39: 3 total, 2 up, 3 in
Dec  1 04:49:59 np0005540826 podman[80574]: 2025-12-01 09:49:59.607689734 +0000 UTC m=+0.074634168 container exec 7c618b1a07db29948a07bf15abaac0e58a16d635b553771e228aad7ce3c0be89 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-crash-compute-1, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:49:59 np0005540826 podman[80574]: 2025-12-01 09:49:59.730296877 +0000 UTC m=+0.197241311 container exec_died 7c618b1a07db29948a07bf15abaac0e58a16d635b553771e228aad7ce3c0be89 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-crash-compute-1, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:49:59 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 2.1 scrub starts
Dec  1 04:49:59 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 2.1 scrub ok
Dec  1 04:50:00 np0005540826 ceph-mon[80026]: from='osd.2 ' entity='osd.2' cmd='[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=compute-2", "root=default"]}]': finished
Dec  1 04:50:00 np0005540826 ceph-mon[80026]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' 
Dec  1 04:50:00 np0005540826 ceph-mon[80026]: [01/Dec/2025:09:49:59] ENGINE Bus STARTING
Dec  1 04:50:00 np0005540826 ceph-mon[80026]: overall HEALTH_OK
Dec  1 04:50:00 np0005540826 ceph-mon[80026]: [01/Dec/2025:09:50:00] ENGINE Serving on http://192.168.122.100:8765
Dec  1 04:50:00 np0005540826 ceph-mon[80026]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' 
Dec  1 04:50:00 np0005540826 ceph-mon[80026]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' 
Dec  1 04:50:00 np0005540826 ceph-mon[80026]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' 
Dec  1 04:50:00 np0005540826 ceph-mon[80026]: [01/Dec/2025:09:50:00] ENGINE Serving on https://192.168.122.100:7150
Dec  1 04:50:00 np0005540826 ceph-mon[80026]: [01/Dec/2025:09:50:00] ENGINE Bus STARTED
Dec  1 04:50:00 np0005540826 ceph-mon[80026]: [01/Dec/2025:09:50:00] ENGINE Client ('192.168.122.100', 46558) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Dec  1 04:50:00 np0005540826 ceph-mon[80026]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' 
Dec  1 04:50:00 np0005540826 ceph-mon[80026]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' 
Dec  1 04:50:00 np0005540826 ceph-mon[80026]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' 
Dec  1 04:50:00 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 2.6 scrub starts
Dec  1 04:50:00 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 2.6 scrub ok
Dec  1 04:50:01 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 39 pg[4.19( empty local-lis/les=30/31 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=39 pruub=10.970262527s) [] r=-1 lpr=39 pi=[30,39)/1 crt=0'0 mlcod 0'0 active pruub 105.957328796s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:50:01 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 39 pg[4.19( empty local-lis/les=30/31 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=39 pruub=10.970262527s) [] r=-1 lpr=39 pi=[30,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 105.957328796s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:50:01 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 39 pg[3.1b( empty local-lis/les=29/30 n=0 ec=29/16 lis/c=29/29 les/c/f=30/30/0 sis=39 pruub=9.963107109s) [] r=-1 lpr=39 pi=[29,39)/1 crt=0'0 mlcod 0'0 active pruub 104.950248718s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:50:01 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 39 pg[6.1b( empty local-lis/les=32/33 n=0 ec=32/21 lis/c=32/32 les/c/f=33/33/0 sis=39 pruub=8.182925224s) [] r=-1 lpr=39 pi=[32,39)/1 crt=0'0 mlcod 0'0 active pruub 103.170082092s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:50:01 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 39 pg[3.1b( empty local-lis/les=29/30 n=0 ec=29/16 lis/c=29/29 les/c/f=30/30/0 sis=39 pruub=9.963107109s) [] r=-1 lpr=39 pi=[29,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 104.950248718s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:50:01 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 39 pg[6.1b( empty local-lis/les=32/33 n=0 ec=32/21 lis/c=32/32 les/c/f=33/33/0 sis=39 pruub=8.182925224s) [] r=-1 lpr=39 pi=[32,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 103.170082092s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:50:01 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 39 pg[4.1d( empty local-lis/les=30/31 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=39 pruub=10.969916344s) [] r=-1 lpr=39 pi=[30,39)/1 crt=0'0 mlcod 0'0 active pruub 105.957206726s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:50:01 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 39 pg[4.1d( empty local-lis/les=30/31 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=39 pruub=10.969916344s) [] r=-1 lpr=39 pi=[30,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 105.957206726s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:50:01 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 39 pg[2.1b( empty local-lis/les=32/33 n=0 ec=28/14 lis/c=32/32 les/c/f=33/33/0 sis=39 pruub=8.182711601s) [] r=-1 lpr=39 pi=[32,39)/1 crt=0'0 mlcod 0'0 active pruub 103.170036316s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:50:01 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 39 pg[2.1b( empty local-lis/les=32/33 n=0 ec=28/14 lis/c=32/32 les/c/f=33/33/0 sis=39 pruub=8.182711601s) [] r=-1 lpr=39 pi=[32,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 103.170036316s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:50:01 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 39 pg[4.3( empty local-lis/les=30/31 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=39 pruub=10.969841957s) [] r=-1 lpr=39 pi=[30,39)/1 crt=0'0 mlcod 0'0 active pruub 105.957336426s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:50:01 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 39 pg[4.3( empty local-lis/les=30/31 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=39 pruub=10.969841957s) [] r=-1 lpr=39 pi=[30,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 105.957336426s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:50:01 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 39 pg[6.1( empty local-lis/les=32/33 n=0 ec=32/21 lis/c=32/32 les/c/f=33/33/0 sis=39 pruub=8.182913780s) [] r=-1 lpr=39 pi=[32,39)/1 crt=0'0 mlcod 0'0 active pruub 103.170494080s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:50:01 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 39 pg[6.1( empty local-lis/les=32/33 n=0 ec=32/21 lis/c=32/32 les/c/f=33/33/0 sis=39 pruub=8.182913780s) [] r=-1 lpr=39 pi=[32,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 103.170494080s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:50:01 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 39 pg[4.6( empty local-lis/les=30/31 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=39 pruub=10.969486237s) [] r=-1 lpr=39 pi=[30,39)/1 crt=0'0 mlcod 0'0 active pruub 105.957130432s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:50:01 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 39 pg[4.6( empty local-lis/les=30/31 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=39 pruub=10.969486237s) [] r=-1 lpr=39 pi=[30,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 105.957130432s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:50:01 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 39 pg[4.2( empty local-lis/les=30/31 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=39 pruub=10.969451904s) [] r=-1 lpr=39 pi=[30,39)/1 crt=0'0 mlcod 0'0 active pruub 105.957214355s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:50:01 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 39 pg[4.2( empty local-lis/les=30/31 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=39 pruub=10.969451904s) [] r=-1 lpr=39 pi=[30,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 105.957214355s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:50:01 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 39 pg[3.8( empty local-lis/les=29/30 n=0 ec=29/16 lis/c=29/29 les/c/f=30/30/0 sis=39 pruub=9.962411880s) [] r=-1 lpr=39 pi=[29,39)/1 crt=0'0 mlcod 0'0 active pruub 104.950180054s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:50:01 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 39 pg[3.8( empty local-lis/les=29/30 n=0 ec=29/16 lis/c=29/29 les/c/f=30/30/0 sis=39 pruub=9.962411880s) [] r=-1 lpr=39 pi=[29,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 104.950180054s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:50:01 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 39 pg[5.0( empty local-lis/les=32/33 n=0 ec=19/19 lis/c=32/32 les/c/f=33/33/0 sis=39 pruub=8.182816505s) [] r=-1 lpr=39 pi=[32,39)/1 crt=0'0 mlcod 0'0 active pruub 103.170669556s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:50:01 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 39 pg[4.1c( empty local-lis/les=30/31 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=39 pruub=10.969414711s) [] r=-1 lpr=39 pi=[30,39)/1 crt=0'0 mlcod 0'0 active pruub 105.957336426s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:50:01 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 39 pg[5.0( empty local-lis/les=32/33 n=0 ec=19/19 lis/c=32/32 les/c/f=33/33/0 sis=39 pruub=8.182816505s) [] r=-1 lpr=39 pi=[32,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 103.170669556s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:50:01 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 39 pg[3.0( empty local-lis/les=29/30 n=0 ec=16/16 lis/c=29/29 les/c/f=30/30/0 sis=39 pruub=9.961943626s) [] r=-1 lpr=39 pi=[29,39)/1 crt=0'0 mlcod 0'0 active pruub 104.950019836s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:50:01 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 39 pg[3.0( empty local-lis/les=29/30 n=0 ec=16/16 lis/c=29/29 les/c/f=30/30/0 sis=39 pruub=9.961943626s) [] r=-1 lpr=39 pi=[29,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 104.950019836s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:50:01 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 39 pg[5.d( empty local-lis/les=32/33 n=0 ec=32/19 lis/c=32/32 les/c/f=33/33/0 sis=39 pruub=8.182596207s) [] r=-1 lpr=39 pi=[32,39)/1 crt=0'0 mlcod 0'0 active pruub 103.170806885s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:50:01 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 39 pg[5.d( empty local-lis/les=32/33 n=0 ec=32/19 lis/c=32/32 les/c/f=33/33/0 sis=39 pruub=8.182596207s) [] r=-1 lpr=39 pi=[32,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 103.170806885s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:50:01 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 39 pg[2.a( empty local-lis/les=32/33 n=0 ec=28/14 lis/c=32/32 les/c/f=33/33/0 sis=39 pruub=8.181622505s) [] r=-1 lpr=39 pi=[32,39)/1 crt=0'0 mlcod 0'0 active pruub 103.169898987s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:50:01 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 39 pg[2.d( empty local-lis/les=32/33 n=0 ec=28/14 lis/c=32/32 les/c/f=33/33/0 sis=39 pruub=8.181388855s) [] r=-1 lpr=39 pi=[32,39)/1 crt=0'0 mlcod 0'0 active pruub 103.169723511s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:50:01 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 39 pg[2.d( empty local-lis/les=32/33 n=0 ec=28/14 lis/c=32/32 les/c/f=33/33/0 sis=39 pruub=8.181388855s) [] r=-1 lpr=39 pi=[32,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 103.169723511s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:50:01 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 39 pg[2.a( empty local-lis/les=32/33 n=0 ec=28/14 lis/c=32/32 les/c/f=33/33/0 sis=39 pruub=8.181622505s) [] r=-1 lpr=39 pi=[32,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 103.169898987s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:50:01 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 39 pg[2.c( empty local-lis/les=32/33 n=0 ec=28/14 lis/c=32/32 les/c/f=33/33/0 sis=39 pruub=8.181184769s) [] r=-1 lpr=39 pi=[32,39)/1 crt=0'0 mlcod 0'0 active pruub 103.169570923s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:50:01 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 39 pg[2.c( empty local-lis/les=32/33 n=0 ec=28/14 lis/c=32/32 les/c/f=33/33/0 sis=39 pruub=8.181184769s) [] r=-1 lpr=39 pi=[32,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 103.169570923s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:50:01 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 39 pg[5.b( empty local-lis/les=32/33 n=0 ec=32/19 lis/c=32/32 les/c/f=33/33/0 sis=39 pruub=8.182579041s) [] r=-1 lpr=39 pi=[32,39)/1 crt=0'0 mlcod 0'0 active pruub 103.171005249s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:50:01 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 39 pg[5.b( empty local-lis/les=32/33 n=0 ec=32/19 lis/c=32/32 les/c/f=33/33/0 sis=39 pruub=8.182579041s) [] r=-1 lpr=39 pi=[32,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 103.171005249s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:50:01 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 39 pg[7.a( empty local-lis/les=35/36 n=0 ec=33/22 lis/c=35/35 les/c/f=36/36/0 sis=39 pruub=15.579078674s) [] r=-1 lpr=39 pi=[35,39)/1 crt=0'0 mlcod 0'0 active pruub 110.567581177s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:50:01 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 39 pg[7.a( empty local-lis/les=35/36 n=0 ec=33/22 lis/c=35/35 les/c/f=36/36/0 sis=39 pruub=15.579078674s) [] r=-1 lpr=39 pi=[35,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 110.567581177s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:50:01 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 39 pg[5.8( empty local-lis/les=32/33 n=0 ec=32/19 lis/c=32/32 les/c/f=33/33/0 sis=39 pruub=8.182492256s) [] r=-1 lpr=39 pi=[32,39)/1 crt=0'0 mlcod 0'0 active pruub 103.171066284s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:50:01 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 39 pg[5.8( empty local-lis/les=32/33 n=0 ec=32/19 lis/c=32/32 les/c/f=33/33/0 sis=39 pruub=8.182492256s) [] r=-1 lpr=39 pi=[32,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 103.171066284s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:50:01 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 39 pg[4.1c( empty local-lis/les=30/31 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=39 pruub=10.969414711s) [] r=-1 lpr=39 pi=[30,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 105.957336426s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:50:01 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 39 pg[2.10( empty local-lis/les=32/33 n=0 ec=28/14 lis/c=32/32 les/c/f=33/33/0 sis=39 pruub=8.181400299s) [] r=-1 lpr=39 pi=[32,39)/1 crt=0'0 mlcod 0'0 active pruub 103.170150757s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:50:01 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 39 pg[2.10( empty local-lis/les=32/33 n=0 ec=28/14 lis/c=32/32 les/c/f=33/33/0 sis=39 pruub=8.181400299s) [] r=-1 lpr=39 pi=[32,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 103.170150757s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:50:01 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 39 pg[2.13( empty local-lis/les=32/33 n=0 ec=28/14 lis/c=32/32 les/c/f=33/33/0 sis=39 pruub=8.180770874s) [] r=-1 lpr=39 pi=[32,39)/1 crt=0'0 mlcod 0'0 active pruub 103.169563293s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:50:01 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 39 pg[2.13( empty local-lis/les=32/33 n=0 ec=28/14 lis/c=32/32 les/c/f=33/33/0 sis=39 pruub=8.180770874s) [] r=-1 lpr=39 pi=[32,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 103.169563293s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:50:01 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 39 pg[7.14( empty local-lis/les=35/36 n=0 ec=33/22 lis/c=35/35 les/c/f=36/36/0 sis=39 pruub=15.578892708s) [] r=-1 lpr=39 pi=[35,39)/1 crt=0'0 mlcod 0'0 active pruub 110.567726135s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:50:01 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 39 pg[4.14( empty local-lis/les=30/31 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=39 pruub=10.967916489s) [] r=-1 lpr=39 pi=[30,39)/1 crt=0'0 mlcod 0'0 active pruub 105.956817627s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:50:01 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 39 pg[4.14( empty local-lis/les=30/31 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=39 pruub=10.967916489s) [] r=-1 lpr=39 pi=[30,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 105.956817627s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:50:01 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 39 pg[7.14( empty local-lis/les=35/36 n=0 ec=33/22 lis/c=35/35 les/c/f=36/36/0 sis=39 pruub=15.578892708s) [] r=-1 lpr=39 pi=[35,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 110.567726135s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:50:01 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 39 pg[2.15( empty local-lis/les=32/33 n=0 ec=28/14 lis/c=32/32 les/c/f=33/33/0 sis=39 pruub=8.180565834s) [] r=-1 lpr=39 pi=[32,39)/1 crt=0'0 mlcod 0'0 active pruub 103.169570923s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:50:01 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 39 pg[5.13( empty local-lis/les=32/33 n=0 ec=32/19 lis/c=32/32 les/c/f=33/33/0 sis=39 pruub=8.182383537s) [] r=-1 lpr=39 pi=[32,39)/1 crt=0'0 mlcod 0'0 active pruub 103.171409607s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:50:01 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 39 pg[5.12( empty local-lis/les=32/33 n=0 ec=32/19 lis/c=32/32 les/c/f=33/33/0 sis=39 pruub=8.182183266s) [] r=-1 lpr=39 pi=[32,39)/1 crt=0'0 mlcod 0'0 active pruub 103.171203613s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:50:01 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 39 pg[5.12( empty local-lis/les=32/33 n=0 ec=32/19 lis/c=32/32 les/c/f=33/33/0 sis=39 pruub=8.182183266s) [] r=-1 lpr=39 pi=[32,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 103.171203613s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:50:01 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 39 pg[2.15( empty local-lis/les=32/33 n=0 ec=28/14 lis/c=32/32 les/c/f=33/33/0 sis=39 pruub=8.180565834s) [] r=-1 lpr=39 pi=[32,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 103.169570923s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:50:01 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 39 pg[5.13( empty local-lis/les=32/33 n=0 ec=32/19 lis/c=32/32 les/c/f=33/33/0 sis=39 pruub=8.182383537s) [] r=-1 lpr=39 pi=[32,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 103.171409607s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:50:01 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 39 pg[7.1d( empty local-lis/les=35/36 n=0 ec=33/22 lis/c=35/35 les/c/f=36/36/0 sis=39 pruub=15.578374863s) [] r=-1 lpr=39 pi=[35,39)/1 crt=0'0 mlcod 0'0 active pruub 110.567642212s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:50:01 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 39 pg[7.1d( empty local-lis/les=35/36 n=0 ec=33/22 lis/c=35/35 les/c/f=36/36/0 sis=39 pruub=15.578374863s) [] r=-1 lpr=39 pi=[35,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 110.567642212s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:50:01 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 2.9 scrub starts
Dec  1 04:50:01 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 2.9 scrub ok
Dec  1 04:50:01 np0005540826 ceph-mon[80026]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' 
Dec  1 04:50:01 np0005540826 ceph-mon[80026]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' 
Dec  1 04:50:01 np0005540826 ceph-mon[80026]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' 
Dec  1 04:50:01 np0005540826 ceph-mon[80026]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' 
Dec  1 04:50:01 np0005540826 ceph-mon[80026]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"}]: dispatch
Dec  1 04:50:01 np0005540826 ceph-mon[80026]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' 
Dec  1 04:50:01 np0005540826 ceph-mon[80026]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"}]: dispatch
Dec  1 04:50:01 np0005540826 ceph-mon[80026]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' 
Dec  1 04:50:02 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 2.4 scrub starts
Dec  1 04:50:02 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 2.4 scrub ok
Dec  1 04:50:02 np0005540826 ceph-mon[80026]: Adjusting osd_memory_target on compute-1 to 128.0M
Dec  1 04:50:02 np0005540826 ceph-mon[80026]: Adjusting osd_memory_target on compute-0 to 128.0M
Dec  1 04:50:02 np0005540826 ceph-mon[80026]: Unable to set osd_memory_target on compute-1 to 134220595: error parsing value: Value '134220595' is below minimum 939524096
Dec  1 04:50:02 np0005540826 ceph-mon[80026]: Unable to set osd_memory_target on compute-0 to 134220595: error parsing value: Value '134220595' is below minimum 939524096
Dec  1 04:50:02 np0005540826 ceph-mon[80026]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' 
Dec  1 04:50:02 np0005540826 ceph-mon[80026]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"}]: dispatch
Dec  1 04:50:02 np0005540826 ceph-mon[80026]: Adjusting osd_memory_target on compute-2 to 127.9M
Dec  1 04:50:02 np0005540826 ceph-mon[80026]: Unable to set osd_memory_target on compute-2 to 134214860: error parsing value: Value '134214860' is below minimum 939524096
Dec  1 04:50:02 np0005540826 ceph-mon[80026]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 04:50:02 np0005540826 ceph-mon[80026]: Updating compute-0:/etc/ceph/ceph.conf
Dec  1 04:50:02 np0005540826 ceph-mon[80026]: Updating compute-1:/etc/ceph/ceph.conf
Dec  1 04:50:02 np0005540826 ceph-mon[80026]: Updating compute-2:/etc/ceph/ceph.conf
Dec  1 04:50:02 np0005540826 ceph-mon[80026]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' 
Dec  1 04:50:03 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 5.19 scrub starts
Dec  1 04:50:03 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 5.19 scrub ok
Dec  1 04:50:04 np0005540826 ceph-mon[80026]: Updating compute-1:/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.conf
Dec  1 04:50:04 np0005540826 ceph-mon[80026]: Updating compute-0:/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.conf
Dec  1 04:50:04 np0005540826 ceph-mon[80026]: Updating compute-2:/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.conf
Dec  1 04:50:04 np0005540826 ceph-mon[80026]: Updating compute-1:/etc/ceph/ceph.client.admin.keyring
Dec  1 04:50:04 np0005540826 ceph-mon[80026]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' 
Dec  1 04:50:04 np0005540826 ceph-mon[80026]: Updating compute-0:/etc/ceph/ceph.client.admin.keyring
Dec  1 04:50:04 np0005540826 ceph-mon[80026]: Updating compute-2:/etc/ceph/ceph.client.admin.keyring
Dec  1 04:50:04 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e39 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:50:04 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 6.18 deep-scrub starts
Dec  1 04:50:04 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 6.18 deep-scrub ok
Dec  1 04:50:05 np0005540826 ceph-mon[80026]: Updating compute-1:/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.client.admin.keyring
Dec  1 04:50:05 np0005540826 ceph-mon[80026]: Updating compute-0:/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.client.admin.keyring
Dec  1 04:50:05 np0005540826 ceph-mon[80026]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' 
Dec  1 04:50:05 np0005540826 ceph-mon[80026]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' 
Dec  1 04:50:05 np0005540826 ceph-mon[80026]: Updating compute-2:/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.client.admin.keyring
Dec  1 04:50:05 np0005540826 ceph-mon[80026]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' 
Dec  1 04:50:05 np0005540826 ceph-mon[80026]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' 
Dec  1 04:50:05 np0005540826 ceph-mon[80026]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' 
Dec  1 04:50:05 np0005540826 ceph-mon[80026]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' 
Dec  1 04:50:05 np0005540826 ceph-mon[80026]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' 
Dec  1 04:50:05 np0005540826 ceph-mon[80026]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' 
Dec  1 04:50:05 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 6.1f deep-scrub starts
Dec  1 04:50:05 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 6.1f deep-scrub ok
Dec  1 04:50:06 np0005540826 ceph-mgr[80334]: mgr handle_mgr_map respawning because set of enabled modules changed!
Dec  1 04:50:06 np0005540826 ceph-mgr[80334]: mgr respawn  e: '/usr/bin/ceph-mgr'
Dec  1 04:50:06 np0005540826 ceph-mgr[80334]: mgr respawn  0: '/usr/bin/ceph-mgr'
Dec  1 04:50:06 np0005540826 ceph-mgr[80334]: mgr respawn  1: '-n'
Dec  1 04:50:06 np0005540826 ceph-mgr[80334]: mgr respawn  2: 'mgr.compute-1.ymizfm'
Dec  1 04:50:06 np0005540826 ceph-mgr[80334]: mgr respawn  3: '-f'
Dec  1 04:50:06 np0005540826 ceph-mgr[80334]: mgr respawn  4: '--setuser'
Dec  1 04:50:06 np0005540826 ceph-mgr[80334]: mgr respawn  5: 'ceph'
Dec  1 04:50:06 np0005540826 ceph-mgr[80334]: mgr respawn  6: '--setgroup'
Dec  1 04:50:06 np0005540826 ceph-mgr[80334]: mgr respawn  7: 'ceph'
Dec  1 04:50:06 np0005540826 ceph-mgr[80334]: mgr respawn  8: '--default-log-to-file=false'
Dec  1 04:50:06 np0005540826 ceph-mgr[80334]: mgr respawn  9: '--default-log-to-journald=true'
Dec  1 04:50:06 np0005540826 ceph-mgr[80334]: mgr respawn  10: '--default-log-to-stderr=false'
Dec  1 04:50:06 np0005540826 ceph-mgr[80334]: mgr respawn respawning with exe /usr/bin/ceph-mgr
Dec  1 04:50:06 np0005540826 ceph-mgr[80334]: mgr respawn  exe_path /proc/self/exe
Dec  1 04:50:06 np0005540826 systemd-logind[787]: Session 33 logged out. Waiting for processes to exit.
Dec  1 04:50:06 np0005540826 systemd[1]: session-33.scope: Deactivated successfully.
Dec  1 04:50:06 np0005540826 systemd[1]: session-33.scope: Consumed 4.570s CPU time.
Dec  1 04:50:06 np0005540826 systemd-logind[787]: Removed session 33.
Dec  1 04:50:06 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: ignoring --setuser ceph since I am not root
Dec  1 04:50:06 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: ignoring --setgroup ceph since I am not root
Dec  1 04:50:06 np0005540826 ceph-mgr[80334]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Dec  1 04:50:06 np0005540826 ceph-mgr[80334]: pidfile_write: ignore empty --pid-file
Dec  1 04:50:06 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'alerts'
Dec  1 04:50:06 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e40 e40: 3 total, 3 up, 3 in
Dec  1 04:50:06 np0005540826 ceph-mon[80026]: OSD bench result of 4501.924530 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.2. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Dec  1 04:50:06 np0005540826 ceph-mon[80026]: Deploying daemon node-exporter.compute-0 on compute-0
Dec  1 04:50:06 np0005540826 ceph-mon[80026]: from='client.? 192.168.122.100:0/2440048888' entity='client.admin' cmd=[{"prefix": "mgr module disable", "module": "dashboard"}]: dispatch
Dec  1 04:50:06 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 40 pg[6.1b( empty local-lis/les=32/33 n=0 ec=32/21 lis/c=32/32 les/c/f=33/33/0 sis=40 pruub=3.067525864s) [2] r=-1 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 103.170082092s@ mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:50:06 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 40 pg[6.1b( empty local-lis/les=32/33 n=0 ec=32/21 lis/c=32/32 les/c/f=33/33/0 sis=40 pruub=3.067498207s) [2] r=-1 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 103.170082092s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:50:06 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 40 pg[4.19( empty local-lis/les=30/31 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=40 pruub=5.854603767s) [2] r=-1 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 105.957328796s@ mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:50:06 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 40 pg[4.19( empty local-lis/les=30/31 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=40 pruub=5.854592323s) [2] r=-1 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 105.957328796s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:50:06 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 40 pg[3.1b( empty local-lis/les=29/30 n=0 ec=29/16 lis/c=29/29 les/c/f=30/30/0 sis=40 pruub=4.847329617s) [2] r=-1 lpr=40 pi=[29,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 104.950248718s@ mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:50:06 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 40 pg[3.1b( empty local-lis/les=29/30 n=0 ec=29/16 lis/c=29/29 les/c/f=30/30/0 sis=40 pruub=4.847315311s) [2] r=-1 lpr=40 pi=[29,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 104.950248718s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:50:06 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 40 pg[4.1d( empty local-lis/les=30/31 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=40 pruub=5.854078293s) [2] r=-1 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 105.957206726s@ mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:50:06 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 40 pg[4.1d( empty local-lis/les=30/31 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=40 pruub=5.854055405s) [2] r=-1 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 105.957206726s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:50:06 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 40 pg[4.1c( empty local-lis/les=30/31 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=40 pruub=5.854241371s) [2] r=-1 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 105.957336426s@ mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:50:06 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 40 pg[2.1b( empty local-lis/les=32/33 n=0 ec=28/14 lis/c=32/32 les/c/f=33/33/0 sis=40 pruub=3.066797495s) [2] r=-1 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 103.170036316s@ mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:50:06 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 40 pg[4.1c( empty local-lis/les=30/31 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=40 pruub=5.854108810s) [2] r=-1 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 105.957336426s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:50:06 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 40 pg[2.1b( empty local-lis/les=32/33 n=0 ec=28/14 lis/c=32/32 les/c/f=33/33/0 sis=40 pruub=3.066783667s) [2] r=-1 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 103.170036316s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:50:06 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 40 pg[4.3( empty local-lis/les=30/31 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=40 pruub=5.853828907s) [2] r=-1 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 105.957336426s@ mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:50:06 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 40 pg[4.3( empty local-lis/les=30/31 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=40 pruub=5.853758812s) [2] r=-1 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 105.957336426s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:50:06 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 40 pg[6.1( empty local-lis/les=32/33 n=0 ec=32/21 lis/c=32/32 les/c/f=33/33/0 sis=40 pruub=3.066865444s) [2] r=-1 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 103.170494080s@ mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:50:06 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 40 pg[6.1( empty local-lis/les=32/33 n=0 ec=32/21 lis/c=32/32 les/c/f=33/33/0 sis=40 pruub=3.066842794s) [2] r=-1 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 103.170494080s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:50:06 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 40 pg[3.8( empty local-lis/les=29/30 n=0 ec=29/16 lis/c=29/29 les/c/f=30/30/0 sis=40 pruub=4.846455574s) [2] r=-1 lpr=40 pi=[29,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 104.950180054s@ mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:50:06 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 40 pg[4.6( empty local-lis/les=30/31 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=40 pruub=5.853385925s) [2] r=-1 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 105.957130432s@ mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:50:06 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 40 pg[4.2( empty local-lis/les=30/31 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=40 pruub=5.853427887s) [2] r=-1 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 105.957214355s@ mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:50:06 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 40 pg[3.8( empty local-lis/les=29/30 n=0 ec=29/16 lis/c=29/29 les/c/f=30/30/0 sis=40 pruub=4.846403599s) [2] r=-1 lpr=40 pi=[29,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 104.950180054s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:50:06 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 40 pg[4.2( empty local-lis/les=30/31 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=40 pruub=5.853411674s) [2] r=-1 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 105.957214355s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:50:06 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 40 pg[4.6( empty local-lis/les=30/31 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=40 pruub=5.853363514s) [2] r=-1 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 105.957130432s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:50:06 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 40 pg[5.0( empty local-lis/les=32/33 n=0 ec=19/19 lis/c=32/32 les/c/f=33/33/0 sis=40 pruub=3.066425562s) [2] r=-1 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 103.170669556s@ mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:50:06 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 40 pg[5.0( empty local-lis/les=32/33 n=0 ec=19/19 lis/c=32/32 les/c/f=33/33/0 sis=40 pruub=3.066410303s) [2] r=-1 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 103.170669556s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:50:06 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 40 pg[2.a( empty local-lis/les=32/33 n=0 ec=28/14 lis/c=32/32 les/c/f=33/33/0 sis=40 pruub=3.065604448s) [2] r=-1 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 103.169898987s@ mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:50:06 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 40 pg[2.a( empty local-lis/les=32/33 n=0 ec=28/14 lis/c=32/32 les/c/f=33/33/0 sis=40 pruub=3.065589428s) [2] r=-1 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 103.169898987s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:50:06 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 40 pg[3.0( empty local-lis/les=29/30 n=0 ec=16/16 lis/c=29/29 les/c/f=30/30/0 sis=40 pruub=4.845781326s) [2] r=-1 lpr=40 pi=[29,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 104.950019836s@ mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:50:06 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 40 pg[5.d( empty local-lis/les=32/33 n=0 ec=32/19 lis/c=32/32 les/c/f=33/33/0 sis=40 pruub=3.066381454s) [2] r=-1 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 103.170806885s@ mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:50:06 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 40 pg[5.d( empty local-lis/les=32/33 n=0 ec=32/19 lis/c=32/32 les/c/f=33/33/0 sis=40 pruub=3.066359282s) [2] r=-1 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 103.170806885s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:50:06 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 40 pg[3.0( empty local-lis/les=29/30 n=0 ec=16/16 lis/c=29/29 les/c/f=30/30/0 sis=40 pruub=4.845582008s) [2] r=-1 lpr=40 pi=[29,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 104.950019836s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:50:06 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 40 pg[2.c( empty local-lis/les=32/33 n=0 ec=28/14 lis/c=32/32 les/c/f=33/33/0 sis=40 pruub=3.064831495s) [2] r=-1 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 103.169570923s@ mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:50:06 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 40 pg[5.b( empty local-lis/les=32/33 n=0 ec=32/19 lis/c=32/32 les/c/f=33/33/0 sis=40 pruub=3.066249132s) [2] r=-1 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 103.171005249s@ mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:50:06 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 40 pg[2.c( empty local-lis/les=32/33 n=0 ec=28/14 lis/c=32/32 les/c/f=33/33/0 sis=40 pruub=3.064816475s) [2] r=-1 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 103.169570923s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:50:06 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 40 pg[7.a( empty local-lis/les=35/36 n=0 ec=33/22 lis/c=35/35 les/c/f=36/36/0 sis=40 pruub=10.462810516s) [2] r=-1 lpr=40 pi=[35,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 110.567581177s@ mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:50:06 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 40 pg[5.b( empty local-lis/les=32/33 n=0 ec=32/19 lis/c=32/32 les/c/f=33/33/0 sis=40 pruub=3.066227198s) [2] r=-1 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 103.171005249s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:50:06 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 40 pg[7.a( empty local-lis/les=35/36 n=0 ec=33/22 lis/c=35/35 les/c/f=36/36/0 sis=40 pruub=10.462793350s) [2] r=-1 lpr=40 pi=[35,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 110.567581177s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:50:06 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 40 pg[5.8( empty local-lis/les=32/33 n=0 ec=32/19 lis/c=32/32 les/c/f=33/33/0 sis=40 pruub=3.066205740s) [2] r=-1 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 103.171066284s@ mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:50:06 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 40 pg[5.8( empty local-lis/les=32/33 n=0 ec=32/19 lis/c=32/32 les/c/f=33/33/0 sis=40 pruub=3.066192865s) [2] r=-1 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 103.171066284s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:50:06 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 40 pg[7.14( empty local-lis/les=35/36 n=0 ec=33/22 lis/c=35/35 les/c/f=36/36/0 sis=40 pruub=10.462756157s) [2] r=-1 lpr=40 pi=[35,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 110.567726135s@ mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:50:06 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 40 pg[2.10( empty local-lis/les=32/33 n=0 ec=28/14 lis/c=32/32 les/c/f=33/33/0 sis=40 pruub=3.065156698s) [2] r=-1 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 103.170150757s@ mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:50:06 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 40 pg[7.14( empty local-lis/les=35/36 n=0 ec=33/22 lis/c=35/35 les/c/f=36/36/0 sis=40 pruub=10.462737083s) [2] r=-1 lpr=40 pi=[35,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 110.567726135s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:50:06 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 40 pg[2.10( empty local-lis/les=32/33 n=0 ec=28/14 lis/c=32/32 les/c/f=33/33/0 sis=40 pruub=3.065144539s) [2] r=-1 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 103.170150757s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:50:06 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 40 pg[2.d( empty local-lis/les=32/33 n=0 ec=28/14 lis/c=32/32 les/c/f=33/33/0 sis=40 pruub=3.065067768s) [2] r=-1 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 103.169723511s@ mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:50:06 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 40 pg[2.d( empty local-lis/les=32/33 n=0 ec=28/14 lis/c=32/32 les/c/f=33/33/0 sis=40 pruub=3.064538717s) [2] r=-1 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 103.169723511s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:50:06 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 40 pg[2.13( empty local-lis/les=32/33 n=0 ec=28/14 lis/c=32/32 les/c/f=33/33/0 sis=40 pruub=3.064140320s) [2] r=-1 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 103.169563293s@ mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:50:06 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 40 pg[2.13( empty local-lis/les=32/33 n=0 ec=28/14 lis/c=32/32 les/c/f=33/33/0 sis=40 pruub=3.064065456s) [2] r=-1 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 103.169563293s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:50:06 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 40 pg[4.14( empty local-lis/les=30/31 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=40 pruub=5.851167202s) [2] r=-1 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 105.956817627s@ mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:50:06 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 40 pg[2.15( empty local-lis/les=32/33 n=0 ec=28/14 lis/c=32/32 les/c/f=33/33/0 sis=40 pruub=3.063875437s) [2] r=-1 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 103.169570923s@ mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:50:06 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 40 pg[4.14( empty local-lis/les=30/31 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=40 pruub=5.851136684s) [2] r=-1 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 105.956817627s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:50:06 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 40 pg[5.13( empty local-lis/les=32/33 n=0 ec=32/19 lis/c=32/32 les/c/f=33/33/0 sis=40 pruub=3.065478086s) [2] r=-1 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 103.171409607s@ mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:50:06 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 40 pg[5.13( empty local-lis/les=32/33 n=0 ec=32/19 lis/c=32/32 les/c/f=33/33/0 sis=40 pruub=3.065461874s) [2] r=-1 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 103.171409607s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:50:06 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 40 pg[2.15( empty local-lis/les=32/33 n=0 ec=28/14 lis/c=32/32 les/c/f=33/33/0 sis=40 pruub=3.063805342s) [2] r=-1 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 103.169570923s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:50:06 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 40 pg[5.12( empty local-lis/les=32/33 n=0 ec=32/19 lis/c=32/32 les/c/f=33/33/0 sis=40 pruub=3.065026045s) [2] r=-1 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 103.171203613s@ mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:50:06 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 40 pg[5.12( empty local-lis/les=32/33 n=0 ec=32/19 lis/c=32/32 les/c/f=33/33/0 sis=40 pruub=3.065009117s) [2] r=-1 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 103.171203613s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:50:06 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 40 pg[7.1d( empty local-lis/les=35/36 n=0 ec=33/22 lis/c=35/35 les/c/f=36/36/0 sis=40 pruub=10.461213112s) [2] r=-1 lpr=40 pi=[35,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 110.567642212s@ mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:50:06 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 40 pg[7.1d( empty local-lis/les=35/36 n=0 ec=33/22 lis/c=35/35 les/c/f=36/36/0 sis=40 pruub=10.461191177s) [2] r=-1 lpr=40 pi=[35,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 110.567642212s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:50:06 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 5.1d scrub starts
Dec  1 04:50:06 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 5.1d scrub ok
Dec  1 04:50:06 np0005540826 ceph-mgr[80334]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Dec  1 04:50:06 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'balancer'
Dec  1 04:50:06 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:50:06.795+0000 7fa33f299140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Dec  1 04:50:06 np0005540826 ceph-mgr[80334]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Dec  1 04:50:06 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:50:06.911+0000 7fa33f299140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Dec  1 04:50:06 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'cephadm'
Dec  1 04:50:07 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 6.c scrub starts
Dec  1 04:50:07 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 6.c scrub ok
Dec  1 04:50:07 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'crash'
Dec  1 04:50:07 np0005540826 ceph-mon[80026]: from='client.? 192.168.122.100:0/2440048888' entity='client.admin' cmd='[{"prefix": "mgr module disable", "module": "dashboard"}]': finished
Dec  1 04:50:07 np0005540826 ceph-mon[80026]: osd.2 [v2:192.168.122.102:6800/1185161015,v1:192.168.122.102:6801/1185161015] boot
Dec  1 04:50:07 np0005540826 ceph-mon[80026]: from='client.? 192.168.122.100:0/521759544' entity='client.admin' cmd=[{"prefix": "mgr module enable", "module": "dashboard"}]: dispatch
Dec  1 04:50:07 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:50:07.907+0000 7fa33f299140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Dec  1 04:50:07 np0005540826 ceph-mgr[80334]: mgr[py] Module crash has missing NOTIFY_TYPES member
Dec  1 04:50:07 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'dashboard'
Dec  1 04:50:07 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e41 e41: 3 total, 3 up, 3 in
Dec  1 04:50:08 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'devicehealth'
Dec  1 04:50:08 np0005540826 ceph-mgr[80334]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Dec  1 04:50:08 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'diskprediction_local'
Dec  1 04:50:08 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:50:08.722+0000 7fa33f299140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Dec  1 04:50:08 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 5.5 scrub starts
Dec  1 04:50:08 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 5.5 scrub ok
Dec  1 04:50:08 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Dec  1 04:50:08 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Dec  1 04:50:08 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]:  from numpy import show_config as show_numpy_config
Dec  1 04:50:08 np0005540826 ceph-mgr[80334]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Dec  1 04:50:08 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:50:08.935+0000 7fa33f299140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Dec  1 04:50:08 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'influx'
Dec  1 04:50:08 np0005540826 ceph-mon[80026]: from='client.? 192.168.122.100:0/521759544' entity='client.admin' cmd='[{"prefix": "mgr module enable", "module": "dashboard"}]': finished
Dec  1 04:50:09 np0005540826 ceph-mgr[80334]: mgr[py] Module influx has missing NOTIFY_TYPES member
Dec  1 04:50:09 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:50:09.026+0000 7fa33f299140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Dec  1 04:50:09 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'insights'
Dec  1 04:50:09 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'iostat'
Dec  1 04:50:09 np0005540826 ceph-mgr[80334]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Dec  1 04:50:09 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'k8sevents'
Dec  1 04:50:09 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:50:09.192+0000 7fa33f299140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Dec  1 04:50:09 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e41 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:50:09 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'localpool'
Dec  1 04:50:09 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'mds_autoscaler'
Dec  1 04:50:09 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 6.6 deep-scrub starts
Dec  1 04:50:09 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 6.6 deep-scrub ok
Dec  1 04:50:10 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'mirroring'
Dec  1 04:50:10 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'nfs'
Dec  1 04:50:10 np0005540826 ceph-mgr[80334]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Dec  1 04:50:10 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'orchestrator'
Dec  1 04:50:10 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:50:10.380+0000 7fa33f299140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Dec  1 04:50:10 np0005540826 ceph-mgr[80334]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Dec  1 04:50:10 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'osd_perf_query'
Dec  1 04:50:10 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:50:10.635+0000 7fa33f299140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Dec  1 04:50:10 np0005540826 ceph-mgr[80334]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Dec  1 04:50:10 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:50:10.737+0000 7fa33f299140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Dec  1 04:50:10 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'osd_support'
Dec  1 04:50:10 np0005540826 ceph-mgr[80334]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Dec  1 04:50:10 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'pg_autoscaler'
Dec  1 04:50:10 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:50:10.819+0000 7fa33f299140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Dec  1 04:50:10 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 6.4 scrub starts
Dec  1 04:50:10 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 6.4 scrub ok
Dec  1 04:50:10 np0005540826 ceph-mgr[80334]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Dec  1 04:50:10 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'progress'
Dec  1 04:50:10 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:50:10.912+0000 7fa33f299140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Dec  1 04:50:10 np0005540826 ceph-mgr[80334]: mgr[py] Module progress has missing NOTIFY_TYPES member
Dec  1 04:50:10 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'prometheus'
Dec  1 04:50:10 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:50:10.996+0000 7fa33f299140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Dec  1 04:50:11 np0005540826 ceph-mgr[80334]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Dec  1 04:50:11 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'rbd_support'
Dec  1 04:50:11 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:50:11.371+0000 7fa33f299140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Dec  1 04:50:11 np0005540826 ceph-mgr[80334]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Dec  1 04:50:11 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:50:11.489+0000 7fa33f299140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Dec  1 04:50:11 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'restful'
Dec  1 04:50:11 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'rgw'
Dec  1 04:50:11 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 6.0 scrub starts
Dec  1 04:50:11 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 6.0 scrub ok
Dec  1 04:50:11 np0005540826 ceph-mgr[80334]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Dec  1 04:50:11 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'rook'
Dec  1 04:50:11 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:50:11.953+0000 7fa33f299140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Dec  1 04:50:12 np0005540826 ceph-mgr[80334]: mgr[py] Module rook has missing NOTIFY_TYPES member
Dec  1 04:50:12 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:50:12.543+0000 7fa33f299140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Dec  1 04:50:12 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'selftest'
Dec  1 04:50:12 np0005540826 ceph-mgr[80334]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Dec  1 04:50:12 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:50:12.618+0000 7fa33f299140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Dec  1 04:50:12 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'snap_schedule'
Dec  1 04:50:12 np0005540826 ceph-mgr[80334]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Dec  1 04:50:12 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'stats'
Dec  1 04:50:12 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:50:12.702+0000 7fa33f299140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Dec  1 04:50:12 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'status'
Dec  1 04:50:12 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 5.3 scrub starts
Dec  1 04:50:12 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e42 e42: 3 total, 3 up, 3 in
Dec  1 04:50:12 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 5.3 scrub ok
Dec  1 04:50:12 np0005540826 ceph-mgr[80334]: mgr[py] Module status has missing NOTIFY_TYPES member
Dec  1 04:50:12 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:50:12.857+0000 7fa33f299140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Dec  1 04:50:12 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'telegraf'
Dec  1 04:50:12 np0005540826 ceph-mgr[80334]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Dec  1 04:50:12 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'telemetry'
Dec  1 04:50:12 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:50:12.928+0000 7fa33f299140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Dec  1 04:50:12 np0005540826 ceph-mon[80026]: Active manager daemon compute-0.fospow restarted
Dec  1 04:50:12 np0005540826 ceph-mon[80026]: Activating manager daemon compute-0.fospow
Dec  1 04:50:13 np0005540826 ceph-mgr[80334]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Dec  1 04:50:13 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'test_orchestrator'
Dec  1 04:50:13 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:50:13.094+0000 7fa33f299140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Dec  1 04:50:13 np0005540826 ceph-mgr[80334]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Dec  1 04:50:13 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'volumes'
Dec  1 04:50:13 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:50:13.323+0000 7fa33f299140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Dec  1 04:50:13 np0005540826 ceph-mgr[80334]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Dec  1 04:50:13 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'zabbix'
Dec  1 04:50:13 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:50:13.604+0000 7fa33f299140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Dec  1 04:50:13 np0005540826 ceph-mgr[80334]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Dec  1 04:50:13 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:50:13.668+0000 7fa33f299140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Dec  1 04:50:13 np0005540826 ceph-mgr[80334]: ms_deliver_dispatch: unhandled message 0x559a489af860 mon_map magic: 0 from mon.2 v2:192.168.122.101:3300/0
Dec  1 04:50:13 np0005540826 ceph-mgr[80334]: mgr handle_mgr_map respawning because set of enabled modules changed!
Dec  1 04:50:13 np0005540826 ceph-mgr[80334]: mgr respawn  e: '/usr/bin/ceph-mgr'
Dec  1 04:50:13 np0005540826 ceph-mgr[80334]: mgr respawn  0: '/usr/bin/ceph-mgr'
Dec  1 04:50:13 np0005540826 ceph-mgr[80334]: mgr respawn  1: '-n'
Dec  1 04:50:13 np0005540826 ceph-mgr[80334]: mgr respawn  2: 'mgr.compute-1.ymizfm'
Dec  1 04:50:13 np0005540826 ceph-mgr[80334]: mgr respawn  3: '-f'
Dec  1 04:50:13 np0005540826 ceph-mgr[80334]: mgr respawn  4: '--setuser'
Dec  1 04:50:13 np0005540826 ceph-mgr[80334]: mgr respawn  5: 'ceph'
Dec  1 04:50:13 np0005540826 ceph-mgr[80334]: mgr respawn  6: '--setgroup'
Dec  1 04:50:13 np0005540826 ceph-mgr[80334]: mgr respawn  7: 'ceph'
Dec  1 04:50:13 np0005540826 ceph-mgr[80334]: mgr respawn  8: '--default-log-to-file=false'
Dec  1 04:50:13 np0005540826 ceph-mgr[80334]: mgr respawn  9: '--default-log-to-journald=true'
Dec  1 04:50:13 np0005540826 ceph-mgr[80334]: mgr respawn  10: '--default-log-to-stderr=false'
Dec  1 04:50:13 np0005540826 ceph-mgr[80334]: mgr respawn respawning with exe /usr/bin/ceph-mgr
Dec  1 04:50:13 np0005540826 ceph-mgr[80334]: mgr respawn  exe_path /proc/self/exe
Dec  1 04:50:13 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: ignoring --setuser ceph since I am not root
Dec  1 04:50:13 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: ignoring --setgroup ceph since I am not root
Dec  1 04:50:13 np0005540826 ceph-mgr[80334]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Dec  1 04:50:13 np0005540826 ceph-mgr[80334]: pidfile_write: ignore empty --pid-file
Dec  1 04:50:13 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'alerts'
Dec  1 04:50:13 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 5.6 scrub starts
Dec  1 04:50:13 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 5.6 scrub ok
Dec  1 04:50:13 np0005540826 ceph-mgr[80334]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Dec  1 04:50:13 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'balancer'
Dec  1 04:50:13 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:50:13.889+0000 7efd07065140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Dec  1 04:50:13 np0005540826 ceph-mgr[80334]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Dec  1 04:50:13 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'cephadm'
Dec  1 04:50:13 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:50:13.963+0000 7efd07065140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Dec  1 04:50:14 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e42 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:50:14 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'crash'
Dec  1 04:50:14 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 5.c scrub starts
Dec  1 04:50:14 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 5.c scrub ok
Dec  1 04:50:14 np0005540826 ceph-mgr[80334]: mgr[py] Module crash has missing NOTIFY_TYPES member
Dec  1 04:50:14 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'dashboard'
Dec  1 04:50:14 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:50:14.800+0000 7efd07065140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Dec  1 04:50:15 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'devicehealth'
Dec  1 04:50:15 np0005540826 ceph-mgr[80334]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Dec  1 04:50:15 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'diskprediction_local'
Dec  1 04:50:15 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:50:15.449+0000 7efd07065140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Dec  1 04:50:15 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Dec  1 04:50:15 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Dec  1 04:50:15 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]:  from numpy import show_config as show_numpy_config
Dec  1 04:50:15 np0005540826 ceph-mgr[80334]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Dec  1 04:50:15 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:50:15.619+0000 7efd07065140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Dec  1 04:50:15 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'influx'
Dec  1 04:50:15 np0005540826 ceph-mgr[80334]: mgr[py] Module influx has missing NOTIFY_TYPES member
Dec  1 04:50:15 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'insights'
Dec  1 04:50:15 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:50:15.689+0000 7efd07065140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Dec  1 04:50:15 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'iostat'
Dec  1 04:50:15 np0005540826 ceph-mgr[80334]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Dec  1 04:50:15 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:50:15.822+0000 7efd07065140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Dec  1 04:50:15 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'k8sevents'
Dec  1 04:50:15 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 6.f scrub starts
Dec  1 04:50:15 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 6.f scrub ok
Dec  1 04:50:15 np0005540826 systemd[72588]: Starting Mark boot as successful...
Dec  1 04:50:15 np0005540826 systemd[72588]: Finished Mark boot as successful.
Dec  1 04:50:16 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'localpool'
Dec  1 04:50:16 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'mds_autoscaler'
Dec  1 04:50:16 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'mirroring'
Dec  1 04:50:16 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'nfs'
Dec  1 04:50:16 np0005540826 systemd[1]: Stopping User Manager for UID 42477...
Dec  1 04:50:16 np0005540826 systemd[72588]: Activating special unit Exit the Session...
Dec  1 04:50:16 np0005540826 systemd[72588]: Stopped target Main User Target.
Dec  1 04:50:16 np0005540826 systemd[72588]: Stopped target Basic System.
Dec  1 04:50:16 np0005540826 systemd[72588]: Stopped target Paths.
Dec  1 04:50:16 np0005540826 systemd[72588]: Stopped target Sockets.
Dec  1 04:50:16 np0005540826 systemd[72588]: Stopped target Timers.
Dec  1 04:50:16 np0005540826 systemd[72588]: Stopped Mark boot as successful after the user session has run 2 minutes.
Dec  1 04:50:16 np0005540826 systemd[72588]: Stopped Daily Cleanup of User's Temporary Directories.
Dec  1 04:50:16 np0005540826 systemd[72588]: Closed D-Bus User Message Bus Socket.
Dec  1 04:50:16 np0005540826 systemd[72588]: Stopped Create User's Volatile Files and Directories.
Dec  1 04:50:16 np0005540826 systemd[72588]: Removed slice User Application Slice.
Dec  1 04:50:16 np0005540826 systemd[72588]: Reached target Shutdown.
Dec  1 04:50:16 np0005540826 systemd[72588]: Finished Exit the Session.
Dec  1 04:50:16 np0005540826 systemd[72588]: Reached target Exit the Session.
Dec  1 04:50:16 np0005540826 systemd[1]: user@42477.service: Deactivated successfully.
Dec  1 04:50:16 np0005540826 systemd[1]: Stopped User Manager for UID 42477.
Dec  1 04:50:16 np0005540826 systemd[1]: Stopping User Runtime Directory /run/user/42477...
Dec  1 04:50:16 np0005540826 systemd[1]: run-user-42477.mount: Deactivated successfully.
Dec  1 04:50:16 np0005540826 systemd[1]: user-runtime-dir@42477.service: Deactivated successfully.
Dec  1 04:50:16 np0005540826 systemd[1]: Stopped User Runtime Directory /run/user/42477.
Dec  1 04:50:16 np0005540826 systemd[1]: Removed slice User Slice of UID 42477.
Dec  1 04:50:16 np0005540826 systemd[1]: user-42477.slice: Consumed 1min 3.541s CPU time.
Dec  1 04:50:16 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 5.a scrub starts
Dec  1 04:50:16 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 5.a scrub ok
Dec  1 04:50:16 np0005540826 ceph-mgr[80334]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Dec  1 04:50:16 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'orchestrator'
Dec  1 04:50:16 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:50:16.860+0000 7efd07065140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Dec  1 04:50:17 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:50:17.091+0000 7efd07065140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Dec  1 04:50:17 np0005540826 ceph-mgr[80334]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Dec  1 04:50:17 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'osd_perf_query'
Dec  1 04:50:17 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:50:17.171+0000 7efd07065140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Dec  1 04:50:17 np0005540826 ceph-mgr[80334]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Dec  1 04:50:17 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'osd_support'
Dec  1 04:50:17 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:50:17.242+0000 7efd07065140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Dec  1 04:50:17 np0005540826 ceph-mgr[80334]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Dec  1 04:50:17 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'pg_autoscaler'
Dec  1 04:50:17 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:50:17.322+0000 7efd07065140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Dec  1 04:50:17 np0005540826 ceph-mgr[80334]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Dec  1 04:50:17 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'progress'
Dec  1 04:50:17 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:50:17.397+0000 7efd07065140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Dec  1 04:50:17 np0005540826 ceph-mgr[80334]: mgr[py] Module progress has missing NOTIFY_TYPES member
Dec  1 04:50:17 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'prometheus'
Dec  1 04:50:17 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 6.9 scrub starts
Dec  1 04:50:17 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:50:17.778+0000 7efd07065140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Dec  1 04:50:17 np0005540826 ceph-mgr[80334]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Dec  1 04:50:17 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'rbd_support'
Dec  1 04:50:17 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 6.9 scrub ok
Dec  1 04:50:17 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:50:17.883+0000 7efd07065140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Dec  1 04:50:17 np0005540826 ceph-mgr[80334]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Dec  1 04:50:17 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'restful'
Dec  1 04:50:18 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'rgw'
Dec  1 04:50:18 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:50:18.304+0000 7efd07065140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Dec  1 04:50:18 np0005540826 ceph-mgr[80334]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Dec  1 04:50:18 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'rook'
Dec  1 04:50:18 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 6.b scrub starts
Dec  1 04:50:18 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 6.b scrub ok
Dec  1 04:50:18 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:50:18.867+0000 7efd07065140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Dec  1 04:50:18 np0005540826 ceph-mgr[80334]: mgr[py] Module rook has missing NOTIFY_TYPES member
Dec  1 04:50:18 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'selftest'
Dec  1 04:50:18 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:50:18.939+0000 7efd07065140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Dec  1 04:50:18 np0005540826 ceph-mgr[80334]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Dec  1 04:50:18 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'snap_schedule'
Dec  1 04:50:19 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:50:19.026+0000 7efd07065140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Dec  1 04:50:19 np0005540826 ceph-mgr[80334]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Dec  1 04:50:19 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'stats'
Dec  1 04:50:19 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'status'
Dec  1 04:50:19 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:50:19.187+0000 7efd07065140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Dec  1 04:50:19 np0005540826 ceph-mgr[80334]: mgr[py] Module status has missing NOTIFY_TYPES member
Dec  1 04:50:19 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'telegraf'
Dec  1 04:50:19 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:50:19.254+0000 7efd07065140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Dec  1 04:50:19 np0005540826 ceph-mgr[80334]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Dec  1 04:50:19 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'telemetry'
Dec  1 04:50:19 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e43 e43: 3 total, 3 up, 3 in
Dec  1 04:50:19 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:50:19.415+0000 7efd07065140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Dec  1 04:50:19 np0005540826 ceph-mgr[80334]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Dec  1 04:50:19 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'test_orchestrator'
Dec  1 04:50:19 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e43 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:50:19 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:50:19.641+0000 7efd07065140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Dec  1 04:50:19 np0005540826 ceph-mgr[80334]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Dec  1 04:50:19 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'volumes'
Dec  1 04:50:19 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 6.14 scrub starts
Dec  1 04:50:19 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 6.14 scrub ok
Dec  1 04:50:19 np0005540826 ceph-mon[80026]: Active manager daemon compute-0.fospow restarted
Dec  1 04:50:19 np0005540826 ceph-mon[80026]: Activating manager daemon compute-0.fospow
Dec  1 04:50:19 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:50:19.917+0000 7efd07065140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Dec  1 04:50:19 np0005540826 ceph-mgr[80334]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Dec  1 04:50:19 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'zabbix'
Dec  1 04:50:19 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:50:19.985+0000 7efd07065140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Dec  1 04:50:19 np0005540826 ceph-mgr[80334]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Dec  1 04:50:19 np0005540826 ceph-mgr[80334]: [dashboard DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec  1 04:50:19 np0005540826 ceph-mgr[80334]: mgr load Constructed class from module: dashboard
Dec  1 04:50:19 np0005540826 ceph-mgr[80334]: ms_deliver_dispatch: unhandled message 0x563ef141d860 mon_map magic: 0 from mon.2 v2:192.168.122.101:3300/0
Dec  1 04:50:19 np0005540826 ceph-mgr[80334]: [dashboard INFO root] server: ssl=no host=192.168.122.101 port=8443
Dec  1 04:50:19 np0005540826 ceph-mgr[80334]: [dashboard INFO root] Configured CherryPy, starting engine...
Dec  1 04:50:19 np0005540826 ceph-mgr[80334]: [dashboard INFO root] Starting engine...
Dec  1 04:50:20 np0005540826 systemd[1]: Created slice User Slice of UID 42477.
Dec  1 04:50:20 np0005540826 systemd[1]: Starting User Runtime Directory /run/user/42477...
Dec  1 04:50:20 np0005540826 systemd-logind[787]: New session 34 of user ceph-admin.
Dec  1 04:50:20 np0005540826 systemd[1]: Finished User Runtime Directory /run/user/42477.
Dec  1 04:50:20 np0005540826 systemd[1]: Starting User Manager for UID 42477...
Dec  1 04:50:20 np0005540826 ceph-mgr[80334]: [dashboard INFO root] Engine started...
Dec  1 04:50:20 np0005540826 systemd[81783]: Queued start job for default target Main User Target.
Dec  1 04:50:20 np0005540826 systemd[81783]: Created slice User Application Slice.
Dec  1 04:50:20 np0005540826 systemd[81783]: Started Mark boot as successful after the user session has run 2 minutes.
Dec  1 04:50:20 np0005540826 systemd[81783]: Started Daily Cleanup of User's Temporary Directories.
Dec  1 04:50:20 np0005540826 systemd[81783]: Reached target Paths.
Dec  1 04:50:20 np0005540826 systemd[81783]: Reached target Timers.
Dec  1 04:50:20 np0005540826 systemd[81783]: Starting D-Bus User Message Bus Socket...
Dec  1 04:50:20 np0005540826 systemd[81783]: Starting Create User's Volatile Files and Directories...
Dec  1 04:50:20 np0005540826 systemd[81783]: Finished Create User's Volatile Files and Directories.
Dec  1 04:50:20 np0005540826 systemd[81783]: Listening on D-Bus User Message Bus Socket.
Dec  1 04:50:20 np0005540826 systemd[81783]: Reached target Sockets.
Dec  1 04:50:20 np0005540826 systemd[81783]: Reached target Basic System.
Dec  1 04:50:20 np0005540826 systemd[81783]: Reached target Main User Target.
Dec  1 04:50:20 np0005540826 systemd[81783]: Startup finished in 127ms.
Dec  1 04:50:20 np0005540826 systemd[1]: Started User Manager for UID 42477.
Dec  1 04:50:20 np0005540826 systemd[1]: Started Session 34 of User ceph-admin.
Dec  1 04:50:20 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 5.17 scrub starts
Dec  1 04:50:20 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 5.17 scrub ok
Dec  1 04:50:20 np0005540826 podman[81920]: 2025-12-01 09:50:20.913688852 +0000 UTC m=+0.061214070 container exec 7c618b1a07db29948a07bf15abaac0e58a16d635b553771e228aad7ce3c0be89 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-crash-compute-1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec  1 04:50:20 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).mds e2 new map
Dec  1 04:50:20 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).mds e2 print_map#012e2#012btime 2025-12-01T09:50:20:704588+0000#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0112#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112025-12-01T09:50:20.704523+0000#012modified#0112025-12-01T09:50:20.704523+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012max_mds#0111#012in#011#012up#011{}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0110#012qdb_cluster#011leader: 0 members: #012 #012 
Dec  1 04:50:20 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e44 e44: 3 total, 3 up, 3 in
Dec  1 04:50:20 np0005540826 podman[81920]: 2025-12-01 09:50:20.999164831 +0000 UTC m=+0.146690049 container exec_died 7c618b1a07db29948a07bf15abaac0e58a16d635b553771e228aad7ce3c0be89 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-crash-compute-1, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.schema-version=1.0)
Dec  1 04:50:21 np0005540826 ceph-mon[80026]: Manager daemon compute-0.fospow is now available
Dec  1 04:50:21 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.fospow/mirror_snapshot_schedule"}]: dispatch
Dec  1 04:50:21 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.fospow/trash_purge_schedule"}]: dispatch
Dec  1 04:50:21 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta"}]: dispatch
Dec  1 04:50:21 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"bulk": true, "prefix": "osd pool create", "pool": "cephfs.cephfs.data"}]: dispatch
Dec  1 04:50:21 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]: dispatch
Dec  1 04:50:21 np0005540826 ceph-mon[80026]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN)
Dec  1 04:50:21 np0005540826 ceph-mon[80026]: Health check failed: 1 filesystem is online with fewer MDS than max_mds (MDS_UP_LESS_THAN_MAX)
Dec  1 04:50:21 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 5.14 scrub starts
Dec  1 04:50:21 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 5.14 scrub ok
Dec  1 04:50:22 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]': finished
Dec  1 04:50:22 np0005540826 ceph-mon[80026]: Saving service mds.cephfs spec with placement compute-0;compute-1;compute-2
Dec  1 04:50:22 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:22 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:22 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:22 np0005540826 ceph-mon[80026]: [01/Dec/2025:09:50:21] ENGINE Bus STARTING
Dec  1 04:50:22 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:22 np0005540826 ceph-mon[80026]: [01/Dec/2025:09:50:21] ENGINE Serving on https://192.168.122.100:7150
Dec  1 04:50:22 np0005540826 ceph-mon[80026]: [01/Dec/2025:09:50:21] ENGINE Client ('192.168.122.100', 56006) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Dec  1 04:50:22 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:22 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:22 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:22 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 5.1e scrub starts
Dec  1 04:50:22 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 5.1e scrub ok
Dec  1 04:50:23 np0005540826 ceph-mon[80026]: [01/Dec/2025:09:50:21] ENGINE Serving on http://192.168.122.100:8765
Dec  1 04:50:23 np0005540826 ceph-mon[80026]: [01/Dec/2025:09:50:21] ENGINE Bus STARTED
Dec  1 04:50:23 np0005540826 ceph-mon[80026]: Saving service mds.cephfs spec with placement compute-0;compute-1;compute-2
Dec  1 04:50:23 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:23 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:23 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:23 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"}]: dispatch
Dec  1 04:50:23 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:23 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:23 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"}]: dispatch
Dec  1 04:50:23 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:23 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 7.1b scrub starts
Dec  1 04:50:23 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 7.1b scrub ok
Dec  1 04:50:24 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e45 e45: 3 total, 3 up, 3 in
Dec  1 04:50:24 np0005540826 ceph-mon[80026]: Adjusting osd_memory_target on compute-1 to 128.0M
Dec  1 04:50:24 np0005540826 ceph-mon[80026]: Unable to set osd_memory_target on compute-1 to 134220595: error parsing value: Value '134220595' is below minimum 939524096
Dec  1 04:50:24 np0005540826 ceph-mon[80026]: Adjusting osd_memory_target on compute-2 to 127.9M
Dec  1 04:50:24 np0005540826 ceph-mon[80026]: Unable to set osd_memory_target on compute-2 to 134214860: error parsing value: Value '134214860' is below minimum 939524096
Dec  1 04:50:24 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:24 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Dec  1 04:50:24 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 04:50:24 np0005540826 ceph-mon[80026]: Updating compute-0:/etc/ceph/ceph.conf
Dec  1 04:50:24 np0005540826 ceph-mon[80026]: Updating compute-1:/etc/ceph/ceph.conf
Dec  1 04:50:24 np0005540826 ceph-mon[80026]: Updating compute-2:/etc/ceph/ceph.conf
Dec  1 04:50:24 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool create", "pool": ".nfs", "yes_i_really_mean_it": true}]: dispatch
Dec  1 04:50:24 np0005540826 ceph-mon[80026]: Updating compute-1:/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.conf
Dec  1 04:50:24 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e45 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:50:24 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 7.18 scrub starts
Dec  1 04:50:24 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 45 pg[8.0( empty local-lis/les=0/0 n=0 ec=45/45 lis/c=0/0 les/c/f=0/0/0 sis=45) [0] r=0 lpr=45 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:50:24 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 7.18 scrub ok
Dec  1 04:50:25 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 7.1e deep-scrub starts
Dec  1 04:50:25 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 7.1e deep-scrub ok
Dec  1 04:50:25 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e46 e46: 3 total, 3 up, 3 in
Dec  1 04:50:25 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 46 pg[8.0( empty local-lis/les=45/46 n=0 ec=45/45 lis/c=0/0 les/c/f=0/0/0 sis=45) [0] r=0 lpr=45 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:50:26 np0005540826 ceph-mon[80026]: Updating compute-2:/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.conf
Dec  1 04:50:26 np0005540826 ceph-mon[80026]: Updating compute-0:/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.conf
Dec  1 04:50:26 np0005540826 ceph-mon[80026]: Updating compute-2:/etc/ceph/ceph.client.admin.keyring
Dec  1 04:50:26 np0005540826 ceph-mon[80026]: Updating compute-1:/etc/ceph/ceph.client.admin.keyring
Dec  1 04:50:26 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool create", "pool": ".nfs", "yes_i_really_mean_it": true}]': finished
Dec  1 04:50:26 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool application enable", "pool": ".nfs", "app": "nfs"}]: dispatch
Dec  1 04:50:26 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 7.6 scrub starts
Dec  1 04:50:26 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 7.6 scrub ok
Dec  1 04:50:27 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 7.3 scrub starts
Dec  1 04:50:27 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 7.3 scrub ok
Dec  1 04:50:28 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 7.2 scrub starts
Dec  1 04:50:28 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 7.2 scrub ok
Dec  1 04:50:29 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e46 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:50:29 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 7.4 scrub starts
Dec  1 04:50:29 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 7.4 scrub ok
Dec  1 04:50:30 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 7.f scrub starts
Dec  1 04:50:30 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 7.f scrub ok
Dec  1 04:50:31 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 7.8 scrub starts
Dec  1 04:50:31 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 7.8 scrub ok
Dec  1 04:50:32 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e47 e47: 3 total, 3 up, 3 in
Dec  1 04:50:32 np0005540826 ceph-mon[80026]: Updating compute-0:/etc/ceph/ceph.client.admin.keyring
Dec  1 04:50:32 np0005540826 ceph-mon[80026]: Updating compute-1:/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.client.admin.keyring
Dec  1 04:50:32 np0005540826 ceph-mon[80026]: Updating compute-2:/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.client.admin.keyring
Dec  1 04:50:32 np0005540826 ceph-mon[80026]: Health check failed: 1 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Dec  1 04:50:32 np0005540826 ceph-mon[80026]: Updating compute-0:/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.client.admin.keyring
Dec  1 04:50:32 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool application enable", "pool": ".nfs", "app": "nfs"}]': finished
Dec  1 04:50:32 np0005540826 ceph-mon[80026]: Saving service nfs.cephfs spec with placement compute-0;compute-1;compute-2
Dec  1 04:50:32 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:32 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 7.e deep-scrub starts
Dec  1 04:50:32 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 7.e deep-scrub ok
Dec  1 04:50:32 np0005540826 systemd[1]: Reloading.
Dec  1 04:50:32 np0005540826 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:50:32 np0005540826 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:50:33 np0005540826 ceph-mon[80026]: Health check cleared: POOL_APP_NOT_ENABLED (was: 1 pool(s) do not have an application enabled)
Dec  1 04:50:33 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:33 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:33 np0005540826 ceph-mon[80026]: Saving service ingress.nfs.cephfs spec with placement compute-0;compute-1;compute-2
Dec  1 04:50:33 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:33 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:33 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:33 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:33 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:33 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:33 np0005540826 ceph-mon[80026]: Deploying daemon node-exporter.compute-1 on compute-1
Dec  1 04:50:33 np0005540826 systemd[1]: Reloading.
Dec  1 04:50:33 np0005540826 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:50:33 np0005540826 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:50:33 np0005540826 systemd[1]: Starting Ceph node-exporter.compute-1 for 365f19c2-81e5-5edd-b6b4-280555214d3a...
Dec  1 04:50:33 np0005540826 bash[83261]: Trying to pull quay.io/prometheus/node-exporter:v1.7.0...
Dec  1 04:50:33 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 7.b deep-scrub starts
Dec  1 04:50:33 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 7.b deep-scrub ok
Dec  1 04:50:34 np0005540826 bash[83261]: Getting image source signatures
Dec  1 04:50:34 np0005540826 bash[83261]: Copying blob sha256:324153f2810a9927fcce320af9e4e291e0b6e805cbdd1f338386c756b9defa24
Dec  1 04:50:34 np0005540826 bash[83261]: Copying blob sha256:2abcce694348cd2c949c0e98a7400ebdfd8341021bcf6b541bc72033ce982510
Dec  1 04:50:34 np0005540826 bash[83261]: Copying blob sha256:455fd88e5221bc1e278ef2d059cd70e4df99a24e5af050ede621534276f6cf9a
Dec  1 04:50:34 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:50:34 np0005540826 bash[83261]: Copying config sha256:72c9c208898624938c9e4183d6686ea4a5fd3f912bc29bc3f00147924c521a3e
Dec  1 04:50:34 np0005540826 bash[83261]: Writing manifest to image destination
Dec  1 04:50:34 np0005540826 podman[83261]: 2025-12-01 09:50:34.690677716 +0000 UTC m=+1.112787823 image pull 72c9c208898624938c9e4183d6686ea4a5fd3f912bc29bc3f00147924c521a3e quay.io/prometheus/node-exporter:v1.7.0
Dec  1 04:50:34 np0005540826 podman[83261]: 2025-12-01 09:50:34.706800025 +0000 UTC m=+1.128910112 container create b5cba524f9aa4c6a3dd34b5465b21a05089c422db1ad8a4bb4113778960afe0c (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec  1 04:50:34 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d5fe87243cbaed84b0016583725302c91f00533def1ef19477972bc98c8ee490/merged/etc/node-exporter supports timestamps until 2038 (0x7fffffff)
Dec  1 04:50:34 np0005540826 podman[83261]: 2025-12-01 09:50:34.766355044 +0000 UTC m=+1.188465131 container init b5cba524f9aa4c6a3dd34b5465b21a05089c422db1ad8a4bb4113778960afe0c (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec  1 04:50:34 np0005540826 podman[83261]: 2025-12-01 09:50:34.772475883 +0000 UTC m=+1.194585970 container start b5cba524f9aa4c6a3dd34b5465b21a05089c422db1ad8a4bb4113778960afe0c (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec  1 04:50:34 np0005540826 bash[83261]: b5cba524f9aa4c6a3dd34b5465b21a05089c422db1ad8a4bb4113778960afe0c
Dec  1 04:50:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-1[83337]: ts=2025-12-01T09:50:34.779Z caller=node_exporter.go:192 level=info msg="Starting node_exporter" version="(version=1.7.0, branch=HEAD, revision=7333465abf9efba81876303bb57e6fadb946041b)"
Dec  1 04:50:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-1[83337]: ts=2025-12-01T09:50:34.779Z caller=node_exporter.go:193 level=info msg="Build context" build_context="(go=go1.21.4, platform=linux/amd64, user=root@35918982f6d8, date=20231112-23:53:35, tags=netgo osusergo static_build)"
Dec  1 04:50:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-1[83337]: ts=2025-12-01T09:50:34.780Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Dec  1 04:50:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-1[83337]: ts=2025-12-01T09:50:34.780Z caller=diskstats_linux.go:265 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Dec  1 04:50:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-1[83337]: ts=2025-12-01T09:50:34.781Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Dec  1 04:50:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-1[83337]: ts=2025-12-01T09:50:34.781Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Dec  1 04:50:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-1[83337]: ts=2025-12-01T09:50:34.782Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Dec  1 04:50:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-1[83337]: ts=2025-12-01T09:50:34.782Z caller=node_exporter.go:117 level=info collector=arp
Dec  1 04:50:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-1[83337]: ts=2025-12-01T09:50:34.782Z caller=node_exporter.go:117 level=info collector=bcache
Dec  1 04:50:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-1[83337]: ts=2025-12-01T09:50:34.782Z caller=node_exporter.go:117 level=info collector=bonding
Dec  1 04:50:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-1[83337]: ts=2025-12-01T09:50:34.782Z caller=node_exporter.go:117 level=info collector=btrfs
Dec  1 04:50:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-1[83337]: ts=2025-12-01T09:50:34.782Z caller=node_exporter.go:117 level=info collector=conntrack
Dec  1 04:50:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-1[83337]: ts=2025-12-01T09:50:34.782Z caller=node_exporter.go:117 level=info collector=cpu
Dec  1 04:50:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-1[83337]: ts=2025-12-01T09:50:34.782Z caller=node_exporter.go:117 level=info collector=cpufreq
Dec  1 04:50:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-1[83337]: ts=2025-12-01T09:50:34.782Z caller=node_exporter.go:117 level=info collector=diskstats
Dec  1 04:50:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-1[83337]: ts=2025-12-01T09:50:34.782Z caller=node_exporter.go:117 level=info collector=dmi
Dec  1 04:50:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-1[83337]: ts=2025-12-01T09:50:34.782Z caller=node_exporter.go:117 level=info collector=edac
Dec  1 04:50:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-1[83337]: ts=2025-12-01T09:50:34.782Z caller=node_exporter.go:117 level=info collector=entropy
Dec  1 04:50:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-1[83337]: ts=2025-12-01T09:50:34.782Z caller=node_exporter.go:117 level=info collector=fibrechannel
Dec  1 04:50:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-1[83337]: ts=2025-12-01T09:50:34.782Z caller=node_exporter.go:117 level=info collector=filefd
Dec  1 04:50:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-1[83337]: ts=2025-12-01T09:50:34.782Z caller=node_exporter.go:117 level=info collector=filesystem
Dec  1 04:50:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-1[83337]: ts=2025-12-01T09:50:34.782Z caller=node_exporter.go:117 level=info collector=hwmon
Dec  1 04:50:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-1[83337]: ts=2025-12-01T09:50:34.782Z caller=node_exporter.go:117 level=info collector=infiniband
Dec  1 04:50:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-1[83337]: ts=2025-12-01T09:50:34.782Z caller=node_exporter.go:117 level=info collector=ipvs
Dec  1 04:50:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-1[83337]: ts=2025-12-01T09:50:34.782Z caller=node_exporter.go:117 level=info collector=loadavg
Dec  1 04:50:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-1[83337]: ts=2025-12-01T09:50:34.782Z caller=node_exporter.go:117 level=info collector=mdadm
Dec  1 04:50:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-1[83337]: ts=2025-12-01T09:50:34.782Z caller=node_exporter.go:117 level=info collector=meminfo
Dec  1 04:50:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-1[83337]: ts=2025-12-01T09:50:34.782Z caller=node_exporter.go:117 level=info collector=netclass
Dec  1 04:50:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-1[83337]: ts=2025-12-01T09:50:34.782Z caller=node_exporter.go:117 level=info collector=netdev
Dec  1 04:50:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-1[83337]: ts=2025-12-01T09:50:34.782Z caller=node_exporter.go:117 level=info collector=netstat
Dec  1 04:50:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-1[83337]: ts=2025-12-01T09:50:34.782Z caller=node_exporter.go:117 level=info collector=nfs
Dec  1 04:50:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-1[83337]: ts=2025-12-01T09:50:34.782Z caller=node_exporter.go:117 level=info collector=nfsd
Dec  1 04:50:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-1[83337]: ts=2025-12-01T09:50:34.782Z caller=node_exporter.go:117 level=info collector=nvme
Dec  1 04:50:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-1[83337]: ts=2025-12-01T09:50:34.782Z caller=node_exporter.go:117 level=info collector=os
Dec  1 04:50:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-1[83337]: ts=2025-12-01T09:50:34.782Z caller=node_exporter.go:117 level=info collector=powersupplyclass
Dec  1 04:50:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-1[83337]: ts=2025-12-01T09:50:34.782Z caller=node_exporter.go:117 level=info collector=pressure
Dec  1 04:50:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-1[83337]: ts=2025-12-01T09:50:34.782Z caller=node_exporter.go:117 level=info collector=rapl
Dec  1 04:50:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-1[83337]: ts=2025-12-01T09:50:34.782Z caller=node_exporter.go:117 level=info collector=schedstat
Dec  1 04:50:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-1[83337]: ts=2025-12-01T09:50:34.782Z caller=node_exporter.go:117 level=info collector=selinux
Dec  1 04:50:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-1[83337]: ts=2025-12-01T09:50:34.782Z caller=node_exporter.go:117 level=info collector=sockstat
Dec  1 04:50:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-1[83337]: ts=2025-12-01T09:50:34.782Z caller=node_exporter.go:117 level=info collector=softnet
Dec  1 04:50:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-1[83337]: ts=2025-12-01T09:50:34.782Z caller=node_exporter.go:117 level=info collector=stat
Dec  1 04:50:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-1[83337]: ts=2025-12-01T09:50:34.782Z caller=node_exporter.go:117 level=info collector=tapestats
Dec  1 04:50:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-1[83337]: ts=2025-12-01T09:50:34.782Z caller=node_exporter.go:117 level=info collector=textfile
Dec  1 04:50:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-1[83337]: ts=2025-12-01T09:50:34.782Z caller=node_exporter.go:117 level=info collector=thermal_zone
Dec  1 04:50:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-1[83337]: ts=2025-12-01T09:50:34.782Z caller=node_exporter.go:117 level=info collector=time
Dec  1 04:50:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-1[83337]: ts=2025-12-01T09:50:34.782Z caller=node_exporter.go:117 level=info collector=udp_queues
Dec  1 04:50:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-1[83337]: ts=2025-12-01T09:50:34.782Z caller=node_exporter.go:117 level=info collector=uname
Dec  1 04:50:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-1[83337]: ts=2025-12-01T09:50:34.782Z caller=node_exporter.go:117 level=info collector=vmstat
Dec  1 04:50:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-1[83337]: ts=2025-12-01T09:50:34.782Z caller=node_exporter.go:117 level=info collector=xfs
Dec  1 04:50:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-1[83337]: ts=2025-12-01T09:50:34.782Z caller=node_exporter.go:117 level=info collector=zfs
Dec  1 04:50:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-1[83337]: ts=2025-12-01T09:50:34.783Z caller=tls_config.go:274 level=info msg="Listening on" address=[::]:9100
Dec  1 04:50:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-1[83337]: ts=2025-12-01T09:50:34.783Z caller=tls_config.go:277 level=info msg="TLS is disabled." http2=false address=[::]:9100
Dec  1 04:50:34 np0005540826 systemd[1]: Started Ceph node-exporter.compute-1 for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 04:50:34 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 7.10 scrub starts
Dec  1 04:50:34 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 7.10 scrub ok
Dec  1 04:50:35 np0005540826 ceph-mon[80026]: from='client.? 192.168.122.100:0/1169522764' entity='client.admin' cmd=[{"prefix": "auth import"}]: dispatch
Dec  1 04:50:35 np0005540826 ceph-mon[80026]: from='client.? 192.168.122.100:0/1169522764' entity='client.admin' cmd='[{"prefix": "auth import"}]': finished
Dec  1 04:50:35 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:35 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:35 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:35 np0005540826 ceph-mon[80026]: Deploying daemon node-exporter.compute-2 on compute-2
Dec  1 04:50:35 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 6.16 scrub starts
Dec  1 04:50:35 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 6.16 scrub ok
Dec  1 04:50:36 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 6.10 scrub starts
Dec  1 04:50:36 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 6.10 scrub ok
Dec  1 04:50:37 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 7.9 scrub starts
Dec  1 04:50:37 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 7.9 scrub ok
Dec  1 04:50:38 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:38 np0005540826 ceph-mon[80026]: from='client.? 192.168.122.100:0/176832347' entity='client.admin' cmd=[{"prefix": "auth get", "entity": "client.openstack"}]: dispatch
Dec  1 04:50:38 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:38 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:38 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:38 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 04:50:38 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 04:50:38 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 6.13 scrub starts
Dec  1 04:50:38 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 6.13 scrub ok
Dec  1 04:50:39 np0005540826 ceph-mon[80026]: Health check failed: 1 OSD(s) experiencing slow operations in BlueStore (BLUESTORE_SLOW_OP_ALERT)
Dec  1 04:50:39 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:50:39 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 6.1d scrub starts
Dec  1 04:50:39 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 6.1d scrub ok
Dec  1 04:50:40 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 7.13 scrub starts
Dec  1 04:50:40 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 7.13 scrub ok
Dec  1 04:50:40 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:41 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 6.11 scrub starts
Dec  1 04:50:41 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 6.11 scrub ok
Dec  1 04:50:44 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:50:44 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:44 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:45 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:45 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:45 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-2.ugomkp", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Dec  1 04:50:45 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-2.ugomkp", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]': finished
Dec  1 04:50:45 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:46 np0005540826 ceph-mon[80026]: Deploying daemon rgw.rgw.compute-2.ugomkp on compute-2
Dec  1 04:50:46 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:46 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:46 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:46 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-1.alkudt", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Dec  1 04:50:46 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-1.alkudt", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]': finished
Dec  1 04:50:46 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:46 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e48 e48: 3 total, 3 up, 3 in
Dec  1 04:50:46 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 48 pg[9.0( empty local-lis/les=0/0 n=0 ec=48/48 lis/c=0/0 les/c/f=0/0/0 sis=48) [0] r=0 lpr=48 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:50:47 np0005540826 podman[83438]: 2025-12-01 09:50:47.419927676 +0000 UTC m=+0.055636318 container create a272f8860674ee4f19d3f3e07986ad2da1df71486f0b4c69f460437bd38d5913 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=relaxed_mcnulty, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, io.buildah.version=1.40.1)
Dec  1 04:50:47 np0005540826 systemd[1]: Started libpod-conmon-a272f8860674ee4f19d3f3e07986ad2da1df71486f0b4c69f460437bd38d5913.scope.
Dec  1 04:50:47 np0005540826 podman[83438]: 2025-12-01 09:50:47.387321268 +0000 UTC m=+0.023029920 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 04:50:47 np0005540826 systemd[1]: Started libcrun container.
Dec  1 04:50:47 np0005540826 podman[83438]: 2025-12-01 09:50:47.510130182 +0000 UTC m=+0.145838854 container init a272f8860674ee4f19d3f3e07986ad2da1df71486f0b4c69f460437bd38d5913 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=relaxed_mcnulty, ceph=True, io.buildah.version=1.40.1, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.build-date=20250325)
Dec  1 04:50:47 np0005540826 podman[83438]: 2025-12-01 09:50:47.522092833 +0000 UTC m=+0.157801485 container start a272f8860674ee4f19d3f3e07986ad2da1df71486f0b4c69f460437bd38d5913 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=relaxed_mcnulty, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, ceph=True, CEPH_REF=squid, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec  1 04:50:47 np0005540826 podman[83438]: 2025-12-01 09:50:47.525349748 +0000 UTC m=+0.161058430 container attach a272f8860674ee4f19d3f3e07986ad2da1df71486f0b4c69f460437bd38d5913 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=relaxed_mcnulty, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, ceph=True, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1)
Dec  1 04:50:47 np0005540826 relaxed_mcnulty[83454]: 167 167
Dec  1 04:50:47 np0005540826 systemd[1]: libpod-a272f8860674ee4f19d3f3e07986ad2da1df71486f0b4c69f460437bd38d5913.scope: Deactivated successfully.
Dec  1 04:50:47 np0005540826 podman[83438]: 2025-12-01 09:50:47.532206706 +0000 UTC m=+0.167915358 container died a272f8860674ee4f19d3f3e07986ad2da1df71486f0b4c69f460437bd38d5913 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=relaxed_mcnulty, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:50:47 np0005540826 systemd[1]: var-lib-containers-storage-overlay-e8c412fac90bf8c9c917c5e38cb1e8c2c6b68068389d9fb100df924c3e17029f-merged.mount: Deactivated successfully.
Dec  1 04:50:47 np0005540826 podman[83438]: 2025-12-01 09:50:47.586539269 +0000 UTC m=+0.222247921 container remove a272f8860674ee4f19d3f3e07986ad2da1df71486f0b4c69f460437bd38d5913 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=relaxed_mcnulty, org.label-schema.build-date=20250325, ceph=True, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec  1 04:50:47 np0005540826 systemd[1]: libpod-conmon-a272f8860674ee4f19d3f3e07986ad2da1df71486f0b4c69f460437bd38d5913.scope: Deactivated successfully.
Dec  1 04:50:47 np0005540826 systemd[1]: Reloading.
Dec  1 04:50:47 np0005540826 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:50:47 np0005540826 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:50:47 np0005540826 systemd[1]: Reloading.
Dec  1 04:50:48 np0005540826 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:50:48 np0005540826 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:50:48 np0005540826 ceph-mon[80026]: Deploying daemon rgw.rgw.compute-1.alkudt on compute-1
Dec  1 04:50:48 np0005540826 ceph-mon[80026]: from='client.? 192.168.122.102:0/1702895159' entity='client.rgw.rgw.compute-2.ugomkp' cmd=[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]: dispatch
Dec  1 04:50:48 np0005540826 ceph-mon[80026]: from='client.? ' entity='client.rgw.rgw.compute-2.ugomkp' cmd=[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]: dispatch
Dec  1 04:50:48 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e49 e49: 3 total, 3 up, 3 in
Dec  1 04:50:48 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 49 pg[9.0( empty local-lis/les=48/49 n=0 ec=48/48 lis/c=0/0 les/c/f=0/0/0 sis=48) [0] r=0 lpr=48 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:50:48 np0005540826 systemd[1]: Starting Ceph rgw.rgw.compute-1.alkudt for 365f19c2-81e5-5edd-b6b4-280555214d3a...
Dec  1 04:50:48 np0005540826 podman[83594]: 2025-12-01 09:50:48.521555057 +0000 UTC m=+0.052056365 container create 3367f534d95cc5ef53fad33ca1091b3e59266c1c7d44573bb95c1a0d3d37dc65 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-rgw-rgw-compute-1-alkudt, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:50:48 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/15fa16088d2c055d19e31084c9c38e1e85e5db3d582986321b8dcb06c970d12d/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:50:48 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/15fa16088d2c055d19e31084c9c38e1e85e5db3d582986321b8dcb06c970d12d/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:50:48 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/15fa16088d2c055d19e31084c9c38e1e85e5db3d582986321b8dcb06c970d12d/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:50:48 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/15fa16088d2c055d19e31084c9c38e1e85e5db3d582986321b8dcb06c970d12d/merged/var/lib/ceph/radosgw/ceph-rgw.rgw.compute-1.alkudt supports timestamps until 2038 (0x7fffffff)
Dec  1 04:50:48 np0005540826 podman[83594]: 2025-12-01 09:50:48.578974731 +0000 UTC m=+0.109476039 container init 3367f534d95cc5ef53fad33ca1091b3e59266c1c7d44573bb95c1a0d3d37dc65 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-rgw-rgw-compute-1-alkudt, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:50:48 np0005540826 podman[83594]: 2025-12-01 09:50:48.584544636 +0000 UTC m=+0.115045944 container start 3367f534d95cc5ef53fad33ca1091b3e59266c1c7d44573bb95c1a0d3d37dc65 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-rgw-rgw-compute-1-alkudt, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec  1 04:50:48 np0005540826 bash[83594]: 3367f534d95cc5ef53fad33ca1091b3e59266c1c7d44573bb95c1a0d3d37dc65
Dec  1 04:50:48 np0005540826 podman[83594]: 2025-12-01 09:50:48.502591234 +0000 UTC m=+0.033092642 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 04:50:48 np0005540826 systemd[1]: Started Ceph rgw.rgw.compute-1.alkudt for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 04:50:48 np0005540826 radosgw[83613]: deferred set uid:gid to 167:167 (ceph:ceph)
Dec  1 04:50:48 np0005540826 radosgw[83613]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process radosgw, pid 2
Dec  1 04:50:48 np0005540826 radosgw[83613]: framework: beast
Dec  1 04:50:48 np0005540826 radosgw[83613]: framework conf key: endpoint, val: 192.168.122.101:8082
Dec  1 04:50:48 np0005540826 radosgw[83613]: init_numa not setting numa affinity
Dec  1 04:50:49 np0005540826 ceph-mon[80026]: Health check failed: 1 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Dec  1 04:50:49 np0005540826 ceph-mon[80026]: from='client.? ' entity='client.rgw.rgw.compute-2.ugomkp' cmd='[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]': finished
Dec  1 04:50:49 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:49 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:49 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:49 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-0.mxrshg", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Dec  1 04:50:49 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-0.mxrshg", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]': finished
Dec  1 04:50:49 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:49 np0005540826 ceph-mon[80026]: Deploying daemon rgw.rgw.compute-0.mxrshg on compute-0
Dec  1 04:50:49 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e50 e50: 3 total, 3 up, 3 in
Dec  1 04:50:49 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"} v 0)
Dec  1 04:50:49 np0005540826 ceph-mon[80026]: log_channel(audit) log [INF] : from='client.? 192.168.122.101:0/603108535' entity='client.rgw.rgw.compute-1.alkudt' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Dec  1 04:50:49 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e50 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:50:50 np0005540826 ceph-mon[80026]: from='client.? 192.168.122.102:0/4186493149' entity='client.rgw.rgw.compute-2.ugomkp' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Dec  1 04:50:50 np0005540826 ceph-mon[80026]: from='client.? ' entity='client.rgw.rgw.compute-2.ugomkp' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Dec  1 04:50:50 np0005540826 ceph-mon[80026]: from='client.? 192.168.122.101:0/603108535' entity='client.rgw.rgw.compute-1.alkudt' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Dec  1 04:50:50 np0005540826 ceph-mon[80026]: from='client.? ' entity='client.rgw.rgw.compute-1.alkudt' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Dec  1 04:50:50 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e51 e51: 3 total, 3 up, 3 in
Dec  1 04:50:52 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e52 e52: 3 total, 3 up, 3 in
Dec  1 04:50:52 np0005540826 ceph-mon[80026]: from='client.? ' entity='client.rgw.rgw.compute-2.ugomkp' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]': finished
Dec  1 04:50:52 np0005540826 ceph-mon[80026]: from='client.? ' entity='client.rgw.rgw.compute-1.alkudt' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]': finished
Dec  1 04:50:52 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"} v 0)
Dec  1 04:50:52 np0005540826 ceph-mon[80026]: log_channel(audit) log [INF] : from='client.? 192.168.122.101:0/603108535' entity='client.rgw.rgw.compute-1.alkudt' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Dec  1 04:50:52 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 52 pg[11.0( empty local-lis/les=0/0 n=0 ec=52/52 lis/c=0/0 les/c/f=0/0/0 sis=52) [0] r=0 lpr=52 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:50:53 np0005540826 ceph-mon[80026]: Health check cleared: POOL_APP_NOT_ENABLED (was: 1 pool(s) do not have an application enabled)
Dec  1 04:50:53 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:53 np0005540826 ceph-mon[80026]: from='client.? 192.168.122.101:0/603108535' entity='client.rgw.rgw.compute-1.alkudt' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Dec  1 04:50:53 np0005540826 ceph-mon[80026]: from='client.? ' entity='client.rgw.rgw.compute-1.alkudt' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Dec  1 04:50:53 np0005540826 ceph-mon[80026]: from='client.? 192.168.122.100:0/4125115031' entity='client.rgw.rgw.compute-0.mxrshg' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Dec  1 04:50:53 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:53 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:53 np0005540826 ceph-mon[80026]: Saving service rgw.rgw spec with placement compute-0;compute-1;compute-2
Dec  1 04:50:53 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:53 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:53 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-2.yoegjc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Dec  1 04:50:53 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-2.yoegjc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Dec  1 04:50:53 np0005540826 ceph-mon[80026]: Deploying daemon mds.cephfs.compute-2.yoegjc on compute-2
Dec  1 04:50:53 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e53 e53: 3 total, 3 up, 3 in
Dec  1 04:50:53 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 53 pg[11.0( empty local-lis/les=52/53 n=0 ec=52/52 lis/c=0/0 les/c/f=0/0/0 sis=52) [0] r=0 lpr=52 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:50:54 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).mds e3 new map
Dec  1 04:50:54 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).mds e3 print_map#012e3#012btime 2025-12-01T09:50:54:337178+0000#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0112#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112025-12-01T09:50:20.704523+0000#012modified#0112025-12-01T09:50:20.704523+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012max_mds#0111#012in#011#012up#011{}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0110#012qdb_cluster#011leader: 0 members: #012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-2.yoegjc{-1:24223} state up:standby seq 1 addr [v2:192.168.122.102:6804/3260542897,v1:192.168.122.102:6805/3260542897] compat {c=[1],r=[1],i=[1fff]}]
Dec  1 04:50:54 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e54 e54: 3 total, 3 up, 3 in
Dec  1 04:50:54 np0005540826 ceph-mon[80026]: from='client.? ' entity='client.rgw.rgw.compute-1.alkudt' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]': finished
Dec  1 04:50:54 np0005540826 ceph-mon[80026]: from='client.? 192.168.122.100:0/4125115031' entity='client.rgw.rgw.compute-0.mxrshg' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]': finished
Dec  1 04:50:54 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:54 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:54 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:54 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-0.xijran", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Dec  1 04:50:54 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-0.xijran", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Dec  1 04:50:54 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"} v 0)
Dec  1 04:50:54 np0005540826 ceph-mon[80026]: log_channel(audit) log [INF] : from='client.? 192.168.122.101:0/603108535' entity='client.rgw.rgw.compute-1.alkudt' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Dec  1 04:50:54 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).mds e4 new map
Dec  1 04:50:54 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).mds e4 print_map#012e4#012btime 2025-12-01T09:50:54:367365+0000#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0114#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112025-12-01T09:50:20.704523+0000#012modified#0112025-12-01T09:50:54.367356+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012max_mds#0111#012in#0110#012up#011{0=24223}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0110#012qdb_cluster#011leader: 0 members: #012[mds.cephfs.compute-2.yoegjc{0:24223} state up:creating seq 1 addr [v2:192.168.122.102:6804/3260542897,v1:192.168.122.102:6805/3260542897] compat {c=[1],r=[1],i=[1fff]}]#012 #012 
Dec  1 04:50:54 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e54 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:50:55 np0005540826 ceph-mon[80026]: Deploying daemon mds.cephfs.compute-0.xijran on compute-0
Dec  1 04:50:55 np0005540826 ceph-mon[80026]: daemon mds.cephfs.compute-2.yoegjc assigned to filesystem cephfs as rank 0 (now has 1 ranks)
Dec  1 04:50:55 np0005540826 ceph-mon[80026]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline)
Dec  1 04:50:55 np0005540826 ceph-mon[80026]: Health check cleared: MDS_UP_LESS_THAN_MAX (was: 1 filesystem is online with fewer MDS than max_mds)
Dec  1 04:50:55 np0005540826 ceph-mon[80026]: from='client.? 192.168.122.100:0/4125115031' entity='client.rgw.rgw.compute-0.mxrshg' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Dec  1 04:50:55 np0005540826 ceph-mon[80026]: from='client.? 192.168.122.101:0/603108535' entity='client.rgw.rgw.compute-1.alkudt' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Dec  1 04:50:55 np0005540826 ceph-mon[80026]: from='client.? 192.168.122.102:0/4186493149' entity='client.rgw.rgw.compute-2.ugomkp' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Dec  1 04:50:55 np0005540826 ceph-mon[80026]: from='client.? ' entity='client.rgw.rgw.compute-2.ugomkp' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Dec  1 04:50:55 np0005540826 ceph-mon[80026]: from='client.? ' entity='client.rgw.rgw.compute-1.alkudt' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Dec  1 04:50:55 np0005540826 ceph-mon[80026]: daemon mds.cephfs.compute-2.yoegjc is now active in filesystem cephfs as rank 0
Dec  1 04:50:55 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:55 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).mds e5 new map
Dec  1 04:50:55 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).mds e5 print_map#012e5#012btime 2025-12-01T09:50:55:377739+0000#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0115#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112025-12-01T09:50:20.704523+0000#012modified#0112025-12-01T09:50:55.377737+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012max_mds#0111#012in#0110#012up#011{0=24223}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0110#012qdb_cluster#011leader: 24223 members: 24223#012[mds.cephfs.compute-2.yoegjc{0:24223} state up:active seq 2 addr [v2:192.168.122.102:6804/3260542897,v1:192.168.122.102:6805/3260542897] compat {c=[1],r=[1],i=[1fff]}]#012 #012 
Dec  1 04:50:55 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e55 e55: 3 total, 3 up, 3 in
Dec  1 04:50:55 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"} v 0)
Dec  1 04:50:55 np0005540826 ceph-mon[80026]: log_channel(audit) log [INF] : from='client.? 192.168.122.101:0/603108535' entity='client.rgw.rgw.compute-1.alkudt' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Dec  1 04:50:56 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e56 e56: 3 total, 3 up, 3 in
Dec  1 04:50:56 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).mds e6 new map
Dec  1 04:50:56 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).mds e6 print_map#012e6#012btime 2025-12-01T09:50:56:603484+0000#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0115#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112025-12-01T09:50:20.704523+0000#012modified#0112025-12-01T09:50:55.377737+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012max_mds#0111#012in#0110#012up#011{0=24223}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0110#012qdb_cluster#011leader: 24223 members: 24223#012[mds.cephfs.compute-2.yoegjc{0:24223} state up:active seq 2 addr [v2:192.168.122.102:6804/3260542897,v1:192.168.122.102:6805/3260542897] compat {c=[1],r=[1],i=[1fff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-0.xijran{-1:14532} state up:standby seq 1 addr [v2:192.168.122.100:6806/2856291086,v1:192.168.122.100:6807/2856291086] compat {c=[1],r=[1],i=[1fff]}]
Dec  1 04:50:56 np0005540826 ceph-mon[80026]: from='client.? 192.168.122.100:0/4125115031' entity='client.rgw.rgw.compute-0.mxrshg' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]': finished
Dec  1 04:50:56 np0005540826 ceph-mon[80026]: from='client.? ' entity='client.rgw.rgw.compute-2.ugomkp' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]': finished
Dec  1 04:50:56 np0005540826 ceph-mon[80026]: from='client.? ' entity='client.rgw.rgw.compute-1.alkudt' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]': finished
Dec  1 04:50:56 np0005540826 ceph-mon[80026]: from='client.? 192.168.122.100:0/4125115031' entity='client.rgw.rgw.compute-0.mxrshg' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Dec  1 04:50:56 np0005540826 ceph-mon[80026]: from='client.? 192.168.122.101:0/603108535' entity='client.rgw.rgw.compute-1.alkudt' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Dec  1 04:50:56 np0005540826 ceph-mon[80026]: from='client.? ' entity='client.rgw.rgw.compute-1.alkudt' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Dec  1 04:50:56 np0005540826 ceph-mon[80026]: from='client.? 192.168.122.102:0/4186493149' entity='client.rgw.rgw.compute-2.ugomkp' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Dec  1 04:50:56 np0005540826 ceph-mon[80026]: from='client.? ' entity='client.rgw.rgw.compute-2.ugomkp' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Dec  1 04:50:56 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:56 np0005540826 ceph-mon[80026]: Health check failed: 1 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Dec  1 04:50:56 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:56 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:56 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-1.ijlzoi", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Dec  1 04:50:56 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-1.ijlzoi", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Dec  1 04:50:56 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).mds e7 new map
Dec  1 04:50:56 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).mds e7 print_map#012e7#012btime 2025-12-01T09:50:56:889938+0000#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0115#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112025-12-01T09:50:20.704523+0000#012modified#0112025-12-01T09:50:55.377737+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012max_mds#0111#012in#0110#012up#011{0=24223}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012qdb_cluster#011leader: 24223 members: 24223#012[mds.cephfs.compute-2.yoegjc{0:24223} state up:active seq 2 addr [v2:192.168.122.102:6804/3260542897,v1:192.168.122.102:6805/3260542897] compat {c=[1],r=[1],i=[1fff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-0.xijran{-1:14532} state up:standby seq 1 addr [v2:192.168.122.100:6806/2856291086,v1:192.168.122.100:6807/2856291086] compat {c=[1],r=[1],i=[1fff]}]
Dec  1 04:50:57 np0005540826 radosgw[83613]: v1 topic migration: starting v1 topic migration..
Dec  1 04:50:57 np0005540826 radosgw[83613]: LDAP not started since no server URIs were provided in the configuration.
Dec  1 04:50:57 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-rgw-rgw-compute-1-alkudt[83609]: 2025-12-01T09:50:57.083+0000 7f2518b88980 -1 LDAP not started since no server URIs were provided in the configuration.
Dec  1 04:50:57 np0005540826 radosgw[83613]: v1 topic migration: finished v1 topic migration
Dec  1 04:50:57 np0005540826 radosgw[83613]: INFO: RGWReshardLock::lock found lock on reshard.0000000000 to be held by another RGW process; skipping for now
Dec  1 04:50:57 np0005540826 radosgw[83613]: framework: beast
Dec  1 04:50:57 np0005540826 radosgw[83613]: framework conf key: ssl_certificate, val: config://rgw/cert/$realm/$zone.crt
Dec  1 04:50:57 np0005540826 radosgw[83613]: framework conf key: ssl_private_key, val: config://rgw/cert/$realm/$zone.key
Dec  1 04:50:57 np0005540826 radosgw[83613]: starting handler: beast
Dec  1 04:50:57 np0005540826 radosgw[83613]: set uid:gid to 167:167 (ceph:ceph)
Dec  1 04:50:57 np0005540826 radosgw[83613]: mgrc service_daemon_register rgw.24164 metadata {arch=x86_64,ceph_release=squid,ceph_version=ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable),ceph_version_short=19.2.3,container_hostname=compute-1,container_image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec,cpu=AMD EPYC-Rome Processor,distro=centos,distro_description=CentOS Stream 9,distro_version=9,frontend_config#0=beast endpoint=192.168.122.101:8082,frontend_type#0=beast,hostname=compute-1,id=rgw.compute-1.alkudt,kernel_description=#1 SMP PREEMPT_DYNAMIC Thu Nov 20 14:15:03 UTC 2025,kernel_version=5.14.0-642.el9.x86_64,mem_swap_kb=1048572,mem_total_kb=7864324,num_handles=1,os=Linux,pid=2,realm_id=,realm_name=,zone_id=a4b474d3-e1dd-44c2-9911-e36e5f368ef5,zone_name=default,zonegroup_id=079816e3-d8ce-476e-bcdd-2df39ad7439e,zonegroup_name=default}
Dec  1 04:50:57 np0005540826 podman[84326]: 2025-12-01 09:50:57.376213965 +0000 UTC m=+0.046857479 container create b8ad8539252930022cd38224928f87508dafc25bf3680f8fe39e309a5285bd9d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=stupefied_euler, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2)
Dec  1 04:50:57 np0005540826 systemd[1]: Started libpod-conmon-b8ad8539252930022cd38224928f87508dafc25bf3680f8fe39e309a5285bd9d.scope.
Dec  1 04:50:57 np0005540826 podman[84326]: 2025-12-01 09:50:57.353280589 +0000 UTC m=+0.023924133 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 04:50:57 np0005540826 systemd[1]: Started libcrun container.
Dec  1 04:50:57 np0005540826 podman[84326]: 2025-12-01 09:50:57.4701972 +0000 UTC m=+0.140840734 container init b8ad8539252930022cd38224928f87508dafc25bf3680f8fe39e309a5285bd9d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=stupefied_euler, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:50:57 np0005540826 podman[84326]: 2025-12-01 09:50:57.480097727 +0000 UTC m=+0.150741261 container start b8ad8539252930022cd38224928f87508dafc25bf3680f8fe39e309a5285bd9d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=stupefied_euler, OSD_FLAVOR=default, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec  1 04:50:57 np0005540826 podman[84326]: 2025-12-01 09:50:57.483451454 +0000 UTC m=+0.154094968 container attach b8ad8539252930022cd38224928f87508dafc25bf3680f8fe39e309a5285bd9d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=stupefied_euler, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, ceph=True, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:50:57 np0005540826 stupefied_euler[84343]: 167 167
Dec  1 04:50:57 np0005540826 systemd[1]: libpod-b8ad8539252930022cd38224928f87508dafc25bf3680f8fe39e309a5285bd9d.scope: Deactivated successfully.
Dec  1 04:50:57 np0005540826 conmon[84343]: conmon b8ad8539252930022cd3 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-b8ad8539252930022cd38224928f87508dafc25bf3680f8fe39e309a5285bd9d.scope/container/memory.events
Dec  1 04:50:57 np0005540826 podman[84326]: 2025-12-01 09:50:57.489372748 +0000 UTC m=+0.160016272 container died b8ad8539252930022cd38224928f87508dafc25bf3680f8fe39e309a5285bd9d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=stupefied_euler, ceph=True, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:50:57 np0005540826 systemd[1]: var-lib-containers-storage-overlay-e46b6024ff950a6d548b34a414c90685c2b2990679bfc0d1479c063263a12c22-merged.mount: Deactivated successfully.
Dec  1 04:50:57 np0005540826 podman[84326]: 2025-12-01 09:50:57.522652574 +0000 UTC m=+0.193296088 container remove b8ad8539252930022cd38224928f87508dafc25bf3680f8fe39e309a5285bd9d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=stupefied_euler, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS)
Dec  1 04:50:57 np0005540826 systemd[1]: libpod-conmon-b8ad8539252930022cd38224928f87508dafc25bf3680f8fe39e309a5285bd9d.scope: Deactivated successfully.
Dec  1 04:50:57 np0005540826 systemd[1]: Reloading.
Dec  1 04:50:57 np0005540826 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:50:57 np0005540826 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:50:57 np0005540826 systemd[1]: Reloading.
Dec  1 04:50:57 np0005540826 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:50:57 np0005540826 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:50:58 np0005540826 ceph-mon[80026]: Deploying daemon mds.cephfs.compute-1.ijlzoi on compute-1
Dec  1 04:50:58 np0005540826 ceph-mon[80026]: from='client.? 192.168.122.100:0/4125115031' entity='client.rgw.rgw.compute-0.mxrshg' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]': finished
Dec  1 04:50:58 np0005540826 ceph-mon[80026]: from='client.? ' entity='client.rgw.rgw.compute-1.alkudt' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]': finished
Dec  1 04:50:58 np0005540826 ceph-mon[80026]: from='client.? ' entity='client.rgw.rgw.compute-2.ugomkp' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]': finished
Dec  1 04:50:58 np0005540826 systemd[1]: Starting Ceph mds.cephfs.compute-1.ijlzoi for 365f19c2-81e5-5edd-b6b4-280555214d3a...
Dec  1 04:50:58 np0005540826 podman[84483]: 2025-12-01 09:50:58.328852511 +0000 UTC m=+0.040516234 container create 3845cc645438c20ed2baf660b47764dbf367220616cddffdd2212523c4dbe351 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mds-cephfs-compute-1-ijlzoi, org.label-schema.vendor=CentOS, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, io.buildah.version=1.40.1, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0)
Dec  1 04:50:58 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/321e28a33620233da857053f62ff132bc5c0d505e10165b482e2845ad284f107/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:50:58 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/321e28a33620233da857053f62ff132bc5c0d505e10165b482e2845ad284f107/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:50:58 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/321e28a33620233da857053f62ff132bc5c0d505e10165b482e2845ad284f107/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:50:58 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/321e28a33620233da857053f62ff132bc5c0d505e10165b482e2845ad284f107/merged/var/lib/ceph/mds/ceph-cephfs.compute-1.ijlzoi supports timestamps until 2038 (0x7fffffff)
Dec  1 04:50:58 np0005540826 podman[84483]: 2025-12-01 09:50:58.39186694 +0000 UTC m=+0.103530683 container init 3845cc645438c20ed2baf660b47764dbf367220616cddffdd2212523c4dbe351 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mds-cephfs-compute-1-ijlzoi, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, OSD_FLAVOR=default, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1)
Dec  1 04:50:58 np0005540826 podman[84483]: 2025-12-01 09:50:58.398453312 +0000 UTC m=+0.110117035 container start 3845cc645438c20ed2baf660b47764dbf367220616cddffdd2212523c4dbe351 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mds-cephfs-compute-1-ijlzoi, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:50:58 np0005540826 bash[84483]: 3845cc645438c20ed2baf660b47764dbf367220616cddffdd2212523c4dbe351
Dec  1 04:50:58 np0005540826 podman[84483]: 2025-12-01 09:50:58.309590871 +0000 UTC m=+0.021254624 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 04:50:58 np0005540826 systemd[1]: Started Ceph mds.cephfs.compute-1.ijlzoi for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 04:50:58 np0005540826 ceph-mds[84503]: set uid:gid to 167:167 (ceph:ceph)
Dec  1 04:50:58 np0005540826 ceph-mds[84503]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mds, pid 2
Dec  1 04:50:58 np0005540826 ceph-mds[84503]: main not setting numa affinity
Dec  1 04:50:58 np0005540826 ceph-mds[84503]: pidfile_write: ignore empty --pid-file
Dec  1 04:50:58 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mds-cephfs-compute-1-ijlzoi[84499]: starting mds.cephfs.compute-1.ijlzoi at 
Dec  1 04:50:58 np0005540826 ceph-mds[84503]: mds.cephfs.compute-1.ijlzoi Updating MDS map to version 7 from mon.2
Dec  1 04:50:59 np0005540826 ceph-mon[80026]: Health check cleared: POOL_APP_NOT_ENABLED (was: 1 pool(s) do not have an application enabled)
Dec  1 04:50:59 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:59 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:59 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:59 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:59 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:59 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:59 np0005540826 ceph-mon[80026]: Creating key for client.nfs.cephfs.0.0.compute-1.osfnzc
Dec  1 04:50:59 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.0.0.compute-1.osfnzc", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]: dispatch
Dec  1 04:50:59 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.0.0.compute-1.osfnzc", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]': finished
Dec  1 04:50:59 np0005540826 ceph-mon[80026]: Ensuring nfs.cephfs.0 is in the ganesha grace table
Dec  1 04:50:59 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]: dispatch
Dec  1 04:50:59 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]': finished
Dec  1 04:50:59 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]: dispatch
Dec  1 04:50:59 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]': finished
Dec  1 04:50:59 np0005540826 ceph-mon[80026]: Rados config object exists: conf-nfs.cephfs
Dec  1 04:50:59 np0005540826 ceph-mon[80026]: Creating key for client.nfs.cephfs.0.0.compute-1.osfnzc-rgw
Dec  1 04:50:59 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.0.0.compute-1.osfnzc-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Dec  1 04:50:59 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.0.0.compute-1.osfnzc-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]': finished
Dec  1 04:50:59 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).mds e8 new map
Dec  1 04:50:59 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).mds e8 print_map#012e8#012btime 2025-12-01T09:50:59:122025+0000#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0115#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112025-12-01T09:50:20.704523+0000#012modified#0112025-12-01T09:50:55.377737+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012max_mds#0111#012in#0110#012up#011{0=24223}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012qdb_cluster#011leader: 24223 members: 24223#012[mds.cephfs.compute-2.yoegjc{0:24223} state up:active seq 2 addr [v2:192.168.122.102:6804/3260542897,v1:192.168.122.102:6805/3260542897] compat {c=[1],r=[1],i=[1fff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-0.xijran{-1:14532} state up:standby seq 1 addr [v2:192.168.122.100:6806/2856291086,v1:192.168.122.100:6807/2856291086] compat {c=[1],r=[1],i=[1fff]}]#012[mds.cephfs.compute-1.ijlzoi{-1:24176} state up:standby seq 1 addr [v2:192.168.122.101:6804/1552678510,v1:192.168.122.101:6805/1552678510] compat {c=[1],r=[1],i=[1fff]}]
Dec  1 04:50:59 np0005540826 ceph-mds[84503]: mds.cephfs.compute-1.ijlzoi Updating MDS map to version 8 from mon.2
Dec  1 04:50:59 np0005540826 ceph-mds[84503]: mds.cephfs.compute-1.ijlzoi Monitors have assigned me to become a standby
Dec  1 04:50:59 np0005540826 podman[84614]: 2025-12-01 09:50:59.45343275 +0000 UTC m=+0.042486756 container create fa8368333846c836be61125d70646ec61284d888bd1d5fe578c137baf6fcba0b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sweet_bardeen, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, ceph=True, org.label-schema.schema-version=1.0)
Dec  1 04:50:59 np0005540826 systemd[1]: Started libpod-conmon-fa8368333846c836be61125d70646ec61284d888bd1d5fe578c137baf6fcba0b.scope.
Dec  1 04:50:59 np0005540826 systemd[1]: Started libcrun container.
Dec  1 04:50:59 np0005540826 podman[84614]: 2025-12-01 09:50:59.528179564 +0000 UTC m=+0.117233580 container init fa8368333846c836be61125d70646ec61284d888bd1d5fe578c137baf6fcba0b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sweet_bardeen, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS)
Dec  1 04:50:59 np0005540826 podman[84614]: 2025-12-01 09:50:59.434158808 +0000 UTC m=+0.023212814 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 04:50:59 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e56 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:50:59 np0005540826 podman[84614]: 2025-12-01 09:50:59.539492598 +0000 UTC m=+0.128546584 container start fa8368333846c836be61125d70646ec61284d888bd1d5fe578c137baf6fcba0b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sweet_bardeen, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.license=GPLv2, io.buildah.version=1.40.1)
Dec  1 04:50:59 np0005540826 podman[84614]: 2025-12-01 09:50:59.542736212 +0000 UTC m=+0.131790208 container attach fa8368333846c836be61125d70646ec61284d888bd1d5fe578c137baf6fcba0b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sweet_bardeen, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:50:59 np0005540826 sweet_bardeen[84630]: 167 167
Dec  1 04:50:59 np0005540826 systemd[1]: libpod-fa8368333846c836be61125d70646ec61284d888bd1d5fe578c137baf6fcba0b.scope: Deactivated successfully.
Dec  1 04:50:59 np0005540826 podman[84614]: 2025-12-01 09:50:59.547443155 +0000 UTC m=+0.136497131 container died fa8368333846c836be61125d70646ec61284d888bd1d5fe578c137baf6fcba0b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sweet_bardeen, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325)
Dec  1 04:50:59 np0005540826 systemd[1]: var-lib-containers-storage-overlay-e70e8708a6c0e296b5673eb5539ad942eeb32d3ec5bc8b9e50d61b7ef2ad0606-merged.mount: Deactivated successfully.
Dec  1 04:50:59 np0005540826 podman[84614]: 2025-12-01 09:50:59.587213189 +0000 UTC m=+0.176267175 container remove fa8368333846c836be61125d70646ec61284d888bd1d5fe578c137baf6fcba0b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sweet_bardeen, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_REF=squid, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec  1 04:50:59 np0005540826 systemd[1]: libpod-conmon-fa8368333846c836be61125d70646ec61284d888bd1d5fe578c137baf6fcba0b.scope: Deactivated successfully.
Dec  1 04:50:59 np0005540826 systemd[1]: Reloading.
Dec  1 04:50:59 np0005540826 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:50:59 np0005540826 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:50:59 np0005540826 systemd[1]: Reloading.
Dec  1 04:51:00 np0005540826 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:51:00 np0005540826 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:51:00 np0005540826 ceph-mon[80026]: Bind address in nfs.cephfs.0.0.compute-1.osfnzc's ganesha conf is defaulting to empty
Dec  1 04:51:00 np0005540826 ceph-mon[80026]: Deploying daemon nfs.cephfs.0.0.compute-1.osfnzc on compute-1
Dec  1 04:51:00 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:51:00 np0005540826 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.osfnzc for 365f19c2-81e5-5edd-b6b4-280555214d3a...
Dec  1 04:51:00 np0005540826 podman[84771]: 2025-12-01 09:51:00.478603391 +0000 UTC m=+0.057114996 container create c502b799588b83b74f4c99801f303489ecc555e47fd37ad9c7ff503c87e0f05d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.build-date=20250325, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:51:00 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a6d838b6ea17cd245fdbc368ca967a1f82e440ca308a4d13386812ccfe74844d/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Dec  1 04:51:00 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a6d838b6ea17cd245fdbc368ca967a1f82e440ca308a4d13386812ccfe74844d/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:51:00 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a6d838b6ea17cd245fdbc368ca967a1f82e440ca308a4d13386812ccfe74844d/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 04:51:00 np0005540826 podman[84771]: 2025-12-01 09:51:00.45395621 +0000 UTC m=+0.032467905 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 04:51:00 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a6d838b6ea17cd245fdbc368ca967a1f82e440ca308a4d13386812ccfe74844d/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.osfnzc-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 04:51:00 np0005540826 podman[84771]: 2025-12-01 09:51:00.564938017 +0000 UTC m=+0.143449652 container init c502b799588b83b74f4c99801f303489ecc555e47fd37ad9c7ff503c87e0f05d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True)
Dec  1 04:51:00 np0005540826 podman[84771]: 2025-12-01 09:51:00.571397235 +0000 UTC m=+0.149908840 container start c502b799588b83b74f4c99801f303489ecc555e47fd37ad9c7ff503c87e0f05d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:51:00 np0005540826 bash[84771]: c502b799588b83b74f4c99801f303489ecc555e47fd37ad9c7ff503c87e0f05d
Dec  1 04:51:00 np0005540826 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.osfnzc for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 04:51:00 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[84787]: 01/12/2025 09:51:00 : epoch 692d6504 : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Dec  1 04:51:00 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[84787]: 01/12/2025 09:51:00 : epoch 692d6504 : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Dec  1 04:51:00 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[84787]: 01/12/2025 09:51:00 : epoch 692d6504 : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Dec  1 04:51:00 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[84787]: 01/12/2025 09:51:00 : epoch 692d6504 : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Dec  1 04:51:00 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[84787]: 01/12/2025 09:51:00 : epoch 692d6504 : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Dec  1 04:51:00 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[84787]: 01/12/2025 09:51:00 : epoch 692d6504 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Dec  1 04:51:00 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[84787]: 01/12/2025 09:51:00 : epoch 692d6504 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Dec  1 04:51:00 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[84787]: 01/12/2025 09:51:00 : epoch 692d6504 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  1 04:51:00 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[84787]: 01/12/2025 09:51:00 : epoch 692d6504 : compute-1 : ganesha.nfsd-2[main] rados_kv_traverse :CLIENT ID :EVENT :Failed to lst kv ret=-2
Dec  1 04:51:00 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[84787]: 01/12/2025 09:51:00 : epoch 692d6504 : compute-1 : ganesha.nfsd-2[main] rados_cluster_read_clids :CLIENT ID :EVENT :Failed to traverse recovery db: -2
Dec  1 04:51:00 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[84787]: 01/12/2025 09:51:00 : epoch 692d6504 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  1 04:51:00 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[84787]: 01/12/2025 09:51:00 : epoch 692d6504 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 04:51:01 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).mds e9 new map
Dec  1 04:51:01 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).mds e9 print_map#012e9#012btime 2025-12-01T09:51:01:191346+0000#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0115#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112025-12-01T09:50:20.704523+0000#012modified#0112025-12-01T09:50:55.377737+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012max_mds#0111#012in#0110#012up#011{0=24223}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012qdb_cluster#011leader: 24223 members: 24223#012[mds.cephfs.compute-2.yoegjc{0:24223} state up:active seq 2 addr [v2:192.168.122.102:6804/3260542897,v1:192.168.122.102:6805/3260542897] compat {c=[1],r=[1],i=[1fff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-0.xijran{-1:14532} state up:standby seq 2 join_fscid=1 addr [v2:192.168.122.100:6806/2856291086,v1:192.168.122.100:6807/2856291086] compat {c=[1],r=[1],i=[1fff]}]#012[mds.cephfs.compute-1.ijlzoi{-1:24176} state 
up:standby seq 1 addr [v2:192.168.122.101:6804/1552678510,v1:192.168.122.101:6805/1552678510] compat {c=[1],r=[1],i=[1fff]}]
Dec  1 04:51:01 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).mds e10 new map
Dec  1 04:51:01 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).mds e10 print_map#012e10#012btime 2025-12-01T09:51:01:219485+0000#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#01110#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112025-12-01T09:50:20.704523+0000#012modified#0112025-12-01T09:51:01.219484+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#01157#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012max_mds#0111#012in#0110#012up#011{0=14532}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012qdb_cluster#011leader: 0 members: #012[mds.cephfs.compute-0.xijran{0:14532} state up:replay seq 2 join_fscid=1 addr [v2:192.168.122.100:6806/2856291086,v1:192.168.122.100:6807/2856291086] compat {c=[1],r=[1],i=[1fff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-1.ijlzoi{-1:24176} state up:standby seq 1 addr [v2:192.168.122.101:6804/1552678510,v1:192.168.122.101:6805/1552678510] compat {c=[1],r=[1],i=[1fff]}]
Dec  1 04:51:01 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e57 e57: 3 total, 3 up, 3 in
Dec  1 04:51:02 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:51:02 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:51:02 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:51:02 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.1.0.compute-2.ymqwfj", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]: dispatch
Dec  1 04:51:02 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.1.0.compute-2.ymqwfj", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]': finished
Dec  1 04:51:02 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]: dispatch
Dec  1 04:51:02 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]': finished
Dec  1 04:51:02 np0005540826 ceph-mon[80026]: Dropping low affinity active daemon mds.cephfs.compute-2.yoegjc in favor of higher affinity standby.
Dec  1 04:51:02 np0005540826 ceph-mon[80026]: Replacing daemon mds.cephfs.compute-2.yoegjc as rank 0 with standby daemon mds.cephfs.compute-0.xijran
Dec  1 04:51:02 np0005540826 ceph-mon[80026]: Health check failed: 1 filesystem is degraded (FS_DEGRADED)
Dec  1 04:51:02 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).mds e11 new map
Dec  1 04:51:02 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).mds e11 print_map#012e11#012btime 2025-12-01T09:51:02:233755+0000#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#01111#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112025-12-01T09:50:20.704523+0000#012modified#0112025-12-01T09:51:01.259526+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#01157#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012max_mds#0111#012in#0110#012up#011{0=14532}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012qdb_cluster#011leader: 0 members: #012[mds.cephfs.compute-0.xijran{0:14532} state up:reconnect seq 3 join_fscid=1 addr [v2:192.168.122.100:6806/2856291086,v1:192.168.122.100:6807/2856291086] compat {c=[1],r=[1],i=[1fff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-1.ijlzoi{-1:24176} state up:standby seq 1 addr [v2:192.168.122.101:6804/1552678510,v1:192.168.122.101:6805/1552678510] compat {c=[1],r=[1],i=[1fff]}]#012[mds.cephfs.compute-2.yoegjc{-1:24241} state 
up:standby seq 1 join_fscid=1 addr [v2:192.168.122.102:6804/3537925606,v1:192.168.122.102:6805/3537925606] compat {c=[1],r=[1],i=[1fff]}]
Dec  1 04:51:03 np0005540826 ceph-mon[80026]: Creating key for client.nfs.cephfs.1.0.compute-2.ymqwfj
Dec  1 04:51:03 np0005540826 ceph-mon[80026]: Ensuring nfs.cephfs.1 is in the ganesha grace table
Dec  1 04:51:03 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).mds e12 new map
Dec  1 04:51:03 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).mds e12 print_map#012e12#012btime 2025-12-01T09:51:03:341186+0000#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#01112#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112025-12-01T09:50:20.704523+0000#012modified#0112025-12-01T09:51:02.346773+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#01157#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012max_mds#0111#012in#0110#012up#011{0=14532}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012qdb_cluster#011leader: 0 members: #012[mds.cephfs.compute-0.xijran{0:14532} state up:rejoin seq 4 join_fscid=1 addr [v2:192.168.122.100:6806/2856291086,v1:192.168.122.100:6807/2856291086] compat {c=[1],r=[1],i=[1fff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-1.ijlzoi{-1:24176} state up:standby seq 2 join_fscid=1 addr [v2:192.168.122.101:6804/1552678510,v1:192.168.122.101:6805/1552678510] compat 
{c=[1],r=[1],i=[1fff]}]#012[mds.cephfs.compute-2.yoegjc{-1:24241} state up:standby seq 1 join_fscid=1 addr [v2:192.168.122.102:6804/3537925606,v1:192.168.122.102:6805/3537925606] compat {c=[1],r=[1],i=[1fff]}]
Dec  1 04:51:03 np0005540826 ceph-mds[84503]: mds.cephfs.compute-1.ijlzoi Updating MDS map to version 12 from mon.2
Dec  1 04:51:03 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[84787]: 01/12/2025 09:51:03 : epoch 692d6504 : compute-1 : ganesha.nfsd-2[main] rados_cluster_end_grace :CLIENT ID :EVENT :Failed to remove rec-0000000000000001:nfs.cephfs.0: -2
Dec  1 04:51:03 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[84787]: 01/12/2025 09:51:03 : epoch 692d6504 : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  1 04:51:03 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[84787]: 01/12/2025 09:51:03 : epoch 692d6504 : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Dec  1 04:51:03 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[84787]: 01/12/2025 09:51:03 : epoch 692d6504 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Dec  1 04:51:03 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[84787]: 01/12/2025 09:51:03 : epoch 692d6504 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Dec  1 04:51:03 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[84787]: 01/12/2025 09:51:03 : epoch 692d6504 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Dec  1 04:51:03 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[84787]: 01/12/2025 09:51:03 : epoch 692d6504 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Dec  1 04:51:03 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[84787]: 01/12/2025 09:51:03 : epoch 692d6504 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Dec  1 04:51:03 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[84787]: 01/12/2025 09:51:03 : epoch 692d6504 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 04:51:03 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[84787]: 01/12/2025 09:51:03 : epoch 692d6504 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 04:51:03 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[84787]: 01/12/2025 09:51:03 : epoch 692d6504 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 04:51:03 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[84787]: 01/12/2025 09:51:03 : epoch 692d6504 : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Dec  1 04:51:03 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[84787]: 01/12/2025 09:51:03 : epoch 692d6504 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 04:51:03 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[84787]: 01/12/2025 09:51:03 : epoch 692d6504 : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Dec  1 04:51:03 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[84787]: 01/12/2025 09:51:03 : epoch 692d6504 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Dec  1 04:51:03 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[84787]: 01/12/2025 09:51:03 : epoch 692d6504 : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Dec  1 04:51:03 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[84787]: 01/12/2025 09:51:03 : epoch 692d6504 : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Dec  1 04:51:03 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[84787]: 01/12/2025 09:51:03 : epoch 692d6504 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Dec  1 04:51:03 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[84787]: 01/12/2025 09:51:03 : epoch 692d6504 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Dec  1 04:51:03 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[84787]: 01/12/2025 09:51:03 : epoch 692d6504 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Dec  1 04:51:03 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[84787]: 01/12/2025 09:51:03 : epoch 692d6504 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Dec  1 04:51:03 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[84787]: 01/12/2025 09:51:03 : epoch 692d6504 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Dec  1 04:51:03 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[84787]: 01/12/2025 09:51:03 : epoch 692d6504 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Dec  1 04:51:03 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[84787]: 01/12/2025 09:51:03 : epoch 692d6504 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Dec  1 04:51:03 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[84787]: 01/12/2025 09:51:03 : epoch 692d6504 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Dec  1 04:51:03 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[84787]: 01/12/2025 09:51:03 : epoch 692d6504 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  1 04:51:03 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[84787]: 01/12/2025 09:51:03 : epoch 692d6504 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Dec  1 04:51:03 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[84787]: 01/12/2025 09:51:03 : epoch 692d6504 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  1 04:51:04 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e57 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:51:04 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).mds e13 new map
Dec  1 04:51:04 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).mds e13 print_map#012e13#012btime 2025-12-01T09:51:04:350567+0000#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#01113#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112025-12-01T09:50:20.704523+0000#012modified#0112025-12-01T09:51:04.350563+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#01157#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012max_mds#0111#012in#0110#012up#011{0=14532}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012qdb_cluster#011leader: 14532 members: 14532#012[mds.cephfs.compute-0.xijran{0:14532} state up:active seq 5 join_fscid=1 addr [v2:192.168.122.100:6806/2856291086,v1:192.168.122.100:6807/2856291086] compat {c=[1],r=[1],i=[1fff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-1.ijlzoi{-1:24176} state up:standby seq 2 join_fscid=1 addr [v2:192.168.122.101:6804/1552678510,v1:192.168.122.101:6805/1552678510] compat {c=[1],r=[1],i=[1fff]}]#012[mds.cephfs.compute-2.yoegjc{-1:24241} state up:standby seq 1 join_fscid=1 addr [v2:192.168.122.102:6804/3537925606,v1:192.168.122.102:6805/3537925606] compat {c=[1],r=[1],i=[1fff]}]
Dec  1 04:51:04 np0005540826 ceph-mon[80026]: daemon mds.cephfs.compute-0.xijran is now active in filesystem cephfs as rank 0
Dec  1 04:51:04 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]: dispatch
Dec  1 04:51:04 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]': finished
Dec  1 04:51:04 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.1.0.compute-2.ymqwfj-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Dec  1 04:51:05 np0005540826 ceph-mon[80026]: Rados config object exists: conf-nfs.cephfs
Dec  1 04:51:05 np0005540826 ceph-mon[80026]: Creating key for client.nfs.cephfs.1.0.compute-2.ymqwfj-rgw
Dec  1 04:51:05 np0005540826 ceph-mon[80026]: Health check cleared: FS_DEGRADED (was: 1 filesystem is degraded)
Dec  1 04:51:05 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.1.0.compute-2.ymqwfj-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]': finished
Dec  1 04:51:05 np0005540826 ceph-mon[80026]: Bind address in nfs.cephfs.1.0.compute-2.ymqwfj's ganesha conf is defaulting to empty
Dec  1 04:51:05 np0005540826 ceph-mon[80026]: Deploying daemon nfs.cephfs.1.0.compute-2.ymqwfj on compute-2
Dec  1 04:51:05 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:51:06 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[84787]: 01/12/2025 09:51:06 : epoch 692d6504 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  1 04:51:06 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[84787]: 01/12/2025 09:51:06 : epoch 692d6504 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  1 04:51:06 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[84787]: 01/12/2025 09:51:06 : epoch 692d6504 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 04:51:06 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:51:06 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:51:06 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:51:06 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.2.0.compute-0.pytvsu", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]: dispatch
Dec  1 04:51:06 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.2.0.compute-0.pytvsu", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]': finished
Dec  1 04:51:06 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]: dispatch
Dec  1 04:51:06 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]': finished
Dec  1 04:51:07 np0005540826 ceph-mon[80026]: Creating key for client.nfs.cephfs.2.0.compute-0.pytvsu
Dec  1 04:51:07 np0005540826 ceph-mon[80026]: Ensuring nfs.cephfs.2 is in the ganesha grace table
Dec  1 04:51:09 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e57 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:51:09 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[84787]: 01/12/2025 09:51:09 : epoch 692d6504 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  1 04:51:09 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]: dispatch
Dec  1 04:51:09 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]': finished
Dec  1 04:51:10 np0005540826 ceph-mon[80026]: Rados config object exists: conf-nfs.cephfs
Dec  1 04:51:10 np0005540826 ceph-mon[80026]: Creating key for client.nfs.cephfs.2.0.compute-0.pytvsu-rgw
Dec  1 04:51:10 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.2.0.compute-0.pytvsu-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Dec  1 04:51:10 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.2.0.compute-0.pytvsu-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]': finished
Dec  1 04:51:10 np0005540826 ceph-mon[80026]: Bind address in nfs.cephfs.2.0.compute-0.pytvsu's ganesha conf is defaulting to empty
Dec  1 04:51:10 np0005540826 ceph-mon[80026]: Deploying daemon nfs.cephfs.2.0.compute-0.pytvsu on compute-0
Dec  1 04:51:12 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[84787]: 01/12/2025 09:51:12 : epoch 692d6504 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  1 04:51:12 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[84787]: 01/12/2025 09:51:12 : epoch 692d6504 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  1 04:51:12 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[84787]: 01/12/2025 09:51:12 : epoch 692d6504 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 04:51:12 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[84787]: 01/12/2025 09:51:12 : epoch 692d6504 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 04:51:12 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[84787]: 01/12/2025 09:51:12 : epoch 692d6504 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 04:51:12 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[84787]: 01/12/2025 09:51:12 : epoch 692d6504 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  1 04:51:13 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:51:13 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:51:13 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:51:13 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:51:13 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:51:13 np0005540826 ceph-mon[80026]: Deploying daemon haproxy.nfs.cephfs.compute-1.pwynis on compute-1
Dec  1 04:51:14 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e57 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:51:15 np0005540826 podman[84932]: 2025-12-01 09:51:15.577298947 +0000 UTC m=+2.567256489 container create 6b9791a956f0fbe444824a31f111449dfbb9d647a5b3a220236de17f0c9e7627 (image=quay.io/ceph/haproxy:2.3, name=distracted_antonelli)
Dec  1 04:51:15 np0005540826 podman[84932]: 2025-12-01 09:51:15.557563754 +0000 UTC m=+2.547521276 image pull e85424b0d443f37ddd2dd8a3bb2ef6f18dd352b987723a921b64289023af2914 quay.io/ceph/haproxy:2.3
Dec  1 04:51:15 np0005540826 systemd[1]: Started libpod-conmon-6b9791a956f0fbe444824a31f111449dfbb9d647a5b3a220236de17f0c9e7627.scope.
Dec  1 04:51:15 np0005540826 systemd[1]: Started libcrun container.
Dec  1 04:51:15 np0005540826 podman[84932]: 2025-12-01 09:51:15.688950271 +0000 UTC m=+2.678907823 container init 6b9791a956f0fbe444824a31f111449dfbb9d647a5b3a220236de17f0c9e7627 (image=quay.io/ceph/haproxy:2.3, name=distracted_antonelli)
Dec  1 04:51:15 np0005540826 podman[84932]: 2025-12-01 09:51:15.696117187 +0000 UTC m=+2.686074719 container start 6b9791a956f0fbe444824a31f111449dfbb9d647a5b3a220236de17f0c9e7627 (image=quay.io/ceph/haproxy:2.3, name=distracted_antonelli)
Dec  1 04:51:15 np0005540826 distracted_antonelli[85045]: 0 0
Dec  1 04:51:15 np0005540826 systemd[1]: libpod-6b9791a956f0fbe444824a31f111449dfbb9d647a5b3a220236de17f0c9e7627.scope: Deactivated successfully.
Dec  1 04:51:15 np0005540826 podman[84932]: 2025-12-01 09:51:15.817529075 +0000 UTC m=+2.807486627 container attach 6b9791a956f0fbe444824a31f111449dfbb9d647a5b3a220236de17f0c9e7627 (image=quay.io/ceph/haproxy:2.3, name=distracted_antonelli)
Dec  1 04:51:15 np0005540826 podman[84932]: 2025-12-01 09:51:15.818557762 +0000 UTC m=+2.808515304 container died 6b9791a956f0fbe444824a31f111449dfbb9d647a5b3a220236de17f0c9e7627 (image=quay.io/ceph/haproxy:2.3, name=distracted_antonelli)
Dec  1 04:51:15 np0005540826 systemd[1]: var-lib-containers-storage-overlay-0ecc7d3813f11cf5a5993f24d6fec7ef311f5c8847f4b9ab28c416a45b995f2a-merged.mount: Deactivated successfully.
Dec  1 04:51:15 np0005540826 podman[84932]: 2025-12-01 09:51:15.866443327 +0000 UTC m=+2.856400839 container remove 6b9791a956f0fbe444824a31f111449dfbb9d647a5b3a220236de17f0c9e7627 (image=quay.io/ceph/haproxy:2.3, name=distracted_antonelli)
Dec  1 04:51:15 np0005540826 systemd[1]: libpod-conmon-6b9791a956f0fbe444824a31f111449dfbb9d647a5b3a220236de17f0c9e7627.scope: Deactivated successfully.
Dec  1 04:51:15 np0005540826 systemd[1]: Reloading.
Dec  1 04:51:16 np0005540826 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:51:16 np0005540826 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:51:16 np0005540826 systemd[1]: Reloading.
Dec  1 04:51:16 np0005540826 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:51:16 np0005540826 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:51:16 np0005540826 systemd[1]: Starting Ceph haproxy.nfs.cephfs.compute-1.pwynis for 365f19c2-81e5-5edd-b6b4-280555214d3a...
Dec  1 04:51:16 np0005540826 podman[85192]: 2025-12-01 09:51:16.739819482 +0000 UTC m=+0.045400062 container create 5d28e05f02e2faaaecccbb224265ce968eb74994512ada8525a0c42a70019a9e (image=quay.io/ceph/haproxy:2.3, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis)
Dec  1 04:51:16 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:51:16 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f19eb65fee5526f9c0ce232d93c48bc5ffa7b474a154857b48556bec8b2cdc58/merged/var/lib/haproxy supports timestamps until 2038 (0x7fffffff)
Dec  1 04:51:16 np0005540826 podman[85192]: 2025-12-01 09:51:16.71973907 +0000 UTC m=+0.025319680 image pull e85424b0d443f37ddd2dd8a3bb2ef6f18dd352b987723a921b64289023af2914 quay.io/ceph/haproxy:2.3
Dec  1 04:51:16 np0005540826 podman[85192]: 2025-12-01 09:51:16.821801974 +0000 UTC m=+0.127382604 container init 5d28e05f02e2faaaecccbb224265ce968eb74994512ada8525a0c42a70019a9e (image=quay.io/ceph/haproxy:2.3, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis)
Dec  1 04:51:16 np0005540826 podman[85192]: 2025-12-01 09:51:16.827259706 +0000 UTC m=+0.132840336 container start 5d28e05f02e2faaaecccbb224265ce968eb74994512ada8525a0c42a70019a9e (image=quay.io/ceph/haproxy:2.3, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis)
Dec  1 04:51:16 np0005540826 bash[85192]: 5d28e05f02e2faaaecccbb224265ce968eb74994512ada8525a0c42a70019a9e
Dec  1 04:51:16 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis[85207]: [NOTICE] 334/095116 (2) : New worker #1 (4) forked
Dec  1 04:51:16 np0005540826 systemd[1]: Started Ceph haproxy.nfs.cephfs.compute-1.pwynis for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 04:51:16 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[84787]: 01/12/2025 09:51:16 : epoch 692d6504 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f09a0000df0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:51:18 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:51:18 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:51:18 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:51:18 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[84787]: 01/12/2025 09:51:18 : epoch 692d6504 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0994001c00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:51:19 np0005540826 ceph-mon[80026]: Deploying daemon haproxy.nfs.cephfs.compute-0.alcixd on compute-0
Dec  1 04:51:19 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e57 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:51:20 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": ".nfs", "var": "pg_num", "val": "32"}]: dispatch
Dec  1 04:51:20 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e58 e58: 3 total, 3 up, 3 in
Dec  1 04:51:20 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[84787]: 01/12/2025 09:51:20 : epoch 692d6504 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f099c0013a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:51:21 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": ".nfs", "var": "pg_num", "val": "32"}]': finished
Dec  1 04:51:21 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num", "val": "32"}]: dispatch
Dec  1 04:51:21 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": ".nfs", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec  1 04:51:22 np0005540826 kernel: ganesha.nfsd[84831]: segfault at 50 ip 00007f0a4dabd32e sp 00007f0a1cff8210 error 4 in libntirpc.so.5.8[7f0a4daa2000+2c000] likely on CPU 0 (core 0, socket 0)
Dec  1 04:51:22 np0005540826 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Dec  1 04:51:22 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[84787]: 01/12/2025 09:51:22 : epoch 692d6504 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f099c0013a0 fd 37 proxy ignored for local
Dec  1 04:51:22 np0005540826 systemd[1]: Created slice Slice /system/systemd-coredump.
Dec  1 04:51:22 np0005540826 systemd[1]: Started Process Core Dump (PID 85225/UID 0).
Dec  1 04:51:23 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e59 e59: 3 total, 3 up, 3 in
Dec  1 04:51:23 np0005540826 systemd-coredump[85226]: Process 84791 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 42:#012#0  0x00007f0a4dabd32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Dec  1 04:51:23 np0005540826 systemd[1]: systemd-coredump@0-85225-0.service: Deactivated successfully.
Dec  1 04:51:23 np0005540826 systemd[1]: systemd-coredump@0-85225-0.service: Consumed 1.065s CPU time.
Dec  1 04:51:24 np0005540826 podman[85232]: 2025-12-01 09:51:24.043473704 +0000 UTC m=+0.023718827 container died c502b799588b83b74f4c99801f303489ecc555e47fd37ad9c7ff503c87e0f05d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:51:24 np0005540826 systemd[1]: var-lib-containers-storage-overlay-a6d838b6ea17cd245fdbc368ca967a1f82e440ca308a4d13386812ccfe74844d-merged.mount: Deactivated successfully.
Dec  1 04:51:24 np0005540826 podman[85232]: 2025-12-01 09:51:24.07944377 +0000 UTC m=+0.059688883 container remove c502b799588b83b74f4c99801f303489ecc555e47fd37ad9c7ff503c87e0f05d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.schema-version=1.0)
Dec  1 04:51:24 np0005540826 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.0.0.compute-1.osfnzc.service: Main process exited, code=exited, status=139/n/a
Dec  1 04:51:24 np0005540826 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.0.0.compute-1.osfnzc.service: Failed with result 'exit-code'.
Dec  1 04:51:24 np0005540826 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.0.0.compute-1.osfnzc.service: Consumed 1.455s CPU time.
Dec  1 04:51:24 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e59 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:51:24 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e60 e60: 3 total, 3 up, 3 in
Dec  1 04:51:24 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 59 pg[8.0( v 57'44 (0'0,57'44] local-lis/les=45/46 n=5 ec=45/45 lis/c=45/45 les/c/f=46/46/0 sis=59 pruub=13.112277985s) [0] r=0 lpr=59 pi=[45,59)/1 crt=57'44 lcod 57'43 mlcod 57'43 active pruub 191.270996094s@ mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [0], acting_primary 0 -> 0, up_primary 0 -> 0, role 0 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:24 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 60 pg[8.0( v 57'44 lc 0'0 (0'0,57'44] local-lis/les=45/46 n=0 ec=45/45 lis/c=45/45 les/c/f=46/46/0 sis=59 pruub=13.112277985s) [0] r=0 lpr=59 pi=[45,59)/1 crt=57'44 lcod 57'43 mlcod 0'0 unknown pruub 191.270996094s@ mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:24 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 60 pg[8.1a( v 57'44 lc 0'0 (0'0,57'44] local-lis/les=45/46 n=0 ec=59/45 lis/c=45/45 les/c/f=46/46/0 sis=59) [0] r=0 lpr=59 pi=[45,59)/1 crt=57'44 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:24 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 60 pg[8.1f( v 57'44 lc 0'0 (0'0,57'44] local-lis/les=45/46 n=0 ec=59/45 lis/c=45/45 les/c/f=46/46/0 sis=59) [0] r=0 lpr=59 pi=[45,59)/1 crt=57'44 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:24 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 60 pg[8.f( v 57'44 lc 0'0 (0'0,57'44] local-lis/les=45/46 n=0 ec=59/45 lis/c=45/45 les/c/f=46/46/0 sis=59) [0] r=0 lpr=59 pi=[45,59)/1 crt=57'44 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:24 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 60 pg[8.19( v 57'44 lc 0'0 (0'0,57'44] local-lis/les=45/46 n=0 ec=59/45 lis/c=45/45 les/c/f=46/46/0 sis=59) [0] r=0 lpr=59 pi=[45,59)/1 crt=57'44 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:24 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 60 pg[8.6( v 57'44 lc 0'0 (0'0,57'44] local-lis/les=45/46 n=0 ec=59/45 lis/c=45/45 les/c/f=46/46/0 sis=59) [0] r=0 lpr=59 pi=[45,59)/1 crt=57'44 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:24 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 60 pg[8.1b( v 57'44 lc 0'0 (0'0,57'44] local-lis/les=45/46 n=0 ec=59/45 lis/c=45/45 les/c/f=46/46/0 sis=59) [0] r=0 lpr=59 pi=[45,59)/1 crt=57'44 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:24 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 60 pg[8.1d( v 57'44 lc 0'0 (0'0,57'44] local-lis/les=45/46 n=0 ec=59/45 lis/c=45/45 les/c/f=46/46/0 sis=59) [0] r=0 lpr=59 pi=[45,59)/1 crt=57'44 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:24 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 60 pg[8.b( v 57'44 lc 0'0 (0'0,57'44] local-lis/les=45/46 n=0 ec=59/45 lis/c=45/45 les/c/f=46/46/0 sis=59) [0] r=0 lpr=59 pi=[45,59)/1 crt=57'44 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:24 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 60 pg[8.11( v 57'44 lc 0'0 (0'0,57'44] local-lis/les=45/46 n=0 ec=59/45 lis/c=45/45 les/c/f=46/46/0 sis=59) [0] r=0 lpr=59 pi=[45,59)/1 crt=57'44 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:24 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 60 pg[8.12( v 57'44 lc 0'0 (0'0,57'44] local-lis/les=45/46 n=0 ec=59/45 lis/c=45/45 les/c/f=46/46/0 sis=59) [0] r=0 lpr=59 pi=[45,59)/1 crt=57'44 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:24 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 60 pg[8.e( v 57'44 lc 0'0 (0'0,57'44] local-lis/les=45/46 n=0 ec=59/45 lis/c=45/45 les/c/f=46/46/0 sis=59) [0] r=0 lpr=59 pi=[45,59)/1 crt=57'44 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:24 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 60 pg[8.d( v 57'44 lc 0'0 (0'0,57'44] local-lis/les=45/46 n=0 ec=59/45 lis/c=45/45 les/c/f=46/46/0 sis=59) [0] r=0 lpr=59 pi=[45,59)/1 crt=57'44 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:24 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 60 pg[8.13( v 57'44 lc 0'0 (0'0,57'44] local-lis/les=45/46 n=0 ec=59/45 lis/c=45/45 les/c/f=46/46/0 sis=59) [0] r=0 lpr=59 pi=[45,59)/1 crt=57'44 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:24 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 60 pg[8.10( v 57'44 lc 0'0 (0'0,57'44] local-lis/les=45/46 n=0 ec=59/45 lis/c=45/45 les/c/f=46/46/0 sis=59) [0] r=0 lpr=59 pi=[45,59)/1 crt=57'44 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:24 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 60 pg[8.1e( v 57'44 lc 0'0 (0'0,57'44] local-lis/les=45/46 n=0 ec=59/45 lis/c=45/45 les/c/f=46/46/0 sis=59) [0] r=0 lpr=59 pi=[45,59)/1 crt=57'44 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:24 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 60 pg[8.17( v 57'44 lc 0'0 (0'0,57'44] local-lis/les=45/46 n=0 ec=59/45 lis/c=45/45 les/c/f=46/46/0 sis=59) [0] r=0 lpr=59 pi=[45,59)/1 crt=57'44 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:24 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 60 pg[8.7( v 57'44 lc 0'0 (0'0,57'44] local-lis/les=45/46 n=0 ec=59/45 lis/c=45/45 les/c/f=46/46/0 sis=59) [0] r=0 lpr=59 pi=[45,59)/1 crt=57'44 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:24 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 60 pg[8.3( v 57'44 lc 0'0 (0'0,57'44] local-lis/les=45/46 n=1 ec=59/45 lis/c=45/45 les/c/f=46/46/0 sis=59) [0] r=0 lpr=59 pi=[45,59)/1 crt=57'44 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:24 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 60 pg[8.4( v 57'44 lc 0'0 (0'0,57'44] local-lis/les=45/46 n=1 ec=59/45 lis/c=45/45 les/c/f=46/46/0 sis=59) [0] r=0 lpr=59 pi=[45,59)/1 crt=57'44 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:24 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 60 pg[8.2( v 57'44 lc 0'0 (0'0,57'44] local-lis/les=45/46 n=1 ec=59/45 lis/c=45/45 les/c/f=46/46/0 sis=59) [0] r=0 lpr=59 pi=[45,59)/1 crt=57'44 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:24 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 60 pg[8.5( v 57'44 lc 0'0 (0'0,57'44] local-lis/les=45/46 n=1 ec=59/45 lis/c=45/45 les/c/f=46/46/0 sis=59) [0] r=0 lpr=59 pi=[45,59)/1 crt=57'44 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:24 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 60 pg[8.a( v 57'44 lc 0'0 (0'0,57'44] local-lis/les=45/46 n=0 ec=59/45 lis/c=45/45 les/c/f=46/46/0 sis=59) [0] r=0 lpr=59 pi=[45,59)/1 crt=57'44 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:24 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 60 pg[8.9( v 57'44 lc 0'0 (0'0,57'44] local-lis/les=45/46 n=0 ec=59/45 lis/c=45/45 les/c/f=46/46/0 sis=59) [0] r=0 lpr=59 pi=[45,59)/1 crt=57'44 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:24 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 60 pg[8.8( v 57'44 lc 0'0 (0'0,57'44] local-lis/les=45/46 n=0 ec=59/45 lis/c=45/45 les/c/f=46/46/0 sis=59) [0] r=0 lpr=59 pi=[45,59)/1 crt=57'44 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:24 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 60 pg[8.14( v 57'44 lc 0'0 (0'0,57'44] local-lis/les=45/46 n=0 ec=59/45 lis/c=45/45 les/c/f=46/46/0 sis=59) [0] r=0 lpr=59 pi=[45,59)/1 crt=57'44 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:24 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 60 pg[8.15( v 57'44 lc 0'0 (0'0,57'44] local-lis/les=45/46 n=0 ec=59/45 lis/c=45/45 les/c/f=46/46/0 sis=59) [0] r=0 lpr=59 pi=[45,59)/1 crt=57'44 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:24 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 60 pg[8.1( v 57'44 (0'0,57'44] local-lis/les=45/46 n=1 ec=59/45 lis/c=45/45 les/c/f=46/46/0 sis=59) [0] r=0 lpr=59 pi=[45,59)/1 crt=57'44 lcod 0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:24 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 60 pg[8.18( v 57'44 lc 0'0 (0'0,57'44] local-lis/les=45/46 n=0 ec=59/45 lis/c=45/45 les/c/f=46/46/0 sis=59) [0] r=0 lpr=59 pi=[45,59)/1 crt=57'44 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:24 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 60 pg[8.c( v 57'44 lc 0'0 (0'0,57'44] local-lis/les=45/46 n=0 ec=59/45 lis/c=45/45 les/c/f=46/46/0 sis=59) [0] r=0 lpr=59 pi=[45,59)/1 crt=57'44 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:24 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 60 pg[8.16( v 57'44 lc 0'0 (0'0,57'44] local-lis/les=45/46 n=0 ec=59/45 lis/c=45/45 les/c/f=46/46/0 sis=59) [0] r=0 lpr=59 pi=[45,59)/1 crt=57'44 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:24 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 60 pg[8.1c( v 57'44 lc 0'0 (0'0,57'44] local-lis/les=45/46 n=0 ec=59/45 lis/c=45/45 les/c/f=46/46/0 sis=59) [0] r=0 lpr=59 pi=[45,59)/1 crt=57'44 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:26 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num", "val": "32"}]': finished
Dec  1 04:51:26 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": ".nfs", "var": "pg_num_actual", "val": "32"}]': finished
Dec  1 04:51:26 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": ".nfs", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec  1 04:51:26 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num", "val": "32"}]: dispatch
Dec  1 04:51:26 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": ".nfs", "var": "pg_num_actual", "val": "32"}]': finished
Dec  1 04:51:26 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num", "val": "32"}]': finished
Dec  1 04:51:26 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e61 e61: 3 total, 3 up, 3 in
Dec  1 04:51:26 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 61 pg[9.0( v 49'6 (0'0,49'6] local-lis/les=48/49 n=6 ec=48/48 lis/c=48/48 les/c/f=49/49/0 sis=61 pruub=9.915720940s) [0] r=0 lpr=61 pi=[48,61)/1 crt=49'6 lcod 49'5 mlcod 49'5 active pruub 189.597824097s@ mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [0], acting_primary 0 -> 0, up_primary 0 -> 0, role 0 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:26 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 61 pg[9.0( v 49'6 lc 0'0 (0'0,49'6] local-lis/les=48/49 n=0 ec=48/48 lis/c=48/48 les/c/f=49/49/0 sis=61 pruub=9.915720940s) [0] r=0 lpr=61 pi=[48,61)/1 crt=49'6 lcod 49'5 mlcod 0'0 unknown pruub 189.597824097s@ mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:26 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0).collection(9.0_head 0x55e0b6d18000) operator()   moving buffer(0x55e0b5967a68 space 0x55e0b58ee0e0 0x0~1000 clean)
Dec  1 04:51:26 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0).collection(9.0_head 0x55e0b6d18000) operator()   moving buffer(0x55e0b53eb428 space 0x55e0b587d7a0 0x0~1000 clean)
Dec  1 04:51:26 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0).collection(9.0_head 0x55e0b6d18000) operator()   moving buffer(0x55e0b5997068 space 0x55e0b58efae0 0x0~1000 clean)
Dec  1 04:51:26 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0).collection(9.0_head 0x55e0b6d18000) operator()   moving buffer(0x55e0b5967608 space 0x55e0b59611f0 0x0~1000 clean)
Dec  1 04:51:26 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 61 pg[8.16( v 57'44 (0'0,57'44] local-lis/les=59/61 n=0 ec=59/45 lis/c=45/45 les/c/f=46/46/0 sis=59) [0] r=0 lpr=59 pi=[45,59)/1 crt=57'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:26 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 61 pg[8.14( v 57'44 (0'0,57'44] local-lis/les=59/61 n=0 ec=59/45 lis/c=45/45 les/c/f=46/46/0 sis=59) [0] r=0 lpr=59 pi=[45,59)/1 crt=57'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:26 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 61 pg[8.10( v 57'44 (0'0,57'44] local-lis/les=59/61 n=0 ec=59/45 lis/c=45/45 les/c/f=46/46/0 sis=59) [0] r=0 lpr=59 pi=[45,59)/1 crt=57'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:26 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 61 pg[8.17( v 57'44 (0'0,57'44] local-lis/les=59/61 n=0 ec=59/45 lis/c=45/45 les/c/f=46/46/0 sis=59) [0] r=0 lpr=59 pi=[45,59)/1 crt=57'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:26 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 61 pg[8.15( v 57'44 (0'0,57'44] local-lis/les=59/61 n=0 ec=59/45 lis/c=45/45 les/c/f=46/46/0 sis=59) [0] r=0 lpr=59 pi=[45,59)/1 crt=57'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:26 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 61 pg[8.2( v 57'44 (0'0,57'44] local-lis/les=59/61 n=1 ec=59/45 lis/c=45/45 les/c/f=46/46/0 sis=59) [0] r=0 lpr=59 pi=[45,59)/1 crt=57'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:26 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 61 pg[8.11( v 57'44 (0'0,57'44] local-lis/les=59/61 n=0 ec=59/45 lis/c=45/45 les/c/f=46/46/0 sis=59) [0] r=0 lpr=59 pi=[45,59)/1 crt=57'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:26 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 61 pg[8.3( v 57'44 (0'0,57'44] local-lis/les=59/61 n=1 ec=59/45 lis/c=45/45 les/c/f=46/46/0 sis=59) [0] r=0 lpr=59 pi=[45,59)/1 crt=57'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:26 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 61 pg[8.f( v 57'44 (0'0,57'44] local-lis/les=59/61 n=0 ec=59/45 lis/c=45/45 les/c/f=46/46/0 sis=59) [0] r=0 lpr=59 pi=[45,59)/1 crt=57'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:26 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 61 pg[8.e( v 57'44 (0'0,57'44] local-lis/les=59/61 n=0 ec=59/45 lis/c=45/45 les/c/f=46/46/0 sis=59) [0] r=0 lpr=59 pi=[45,59)/1 crt=57'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:26 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 61 pg[8.d( v 57'44 (0'0,57'44] local-lis/les=59/61 n=0 ec=59/45 lis/c=45/45 les/c/f=46/46/0 sis=59) [0] r=0 lpr=59 pi=[45,59)/1 crt=57'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:26 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 61 pg[8.9( v 57'44 (0'0,57'44] local-lis/les=59/61 n=0 ec=59/45 lis/c=45/45 les/c/f=46/46/0 sis=59) [0] r=0 lpr=59 pi=[45,59)/1 crt=57'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:26 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 61 pg[8.8( v 57'44 (0'0,57'44] local-lis/les=59/61 n=0 ec=59/45 lis/c=45/45 les/c/f=46/46/0 sis=59) [0] r=0 lpr=59 pi=[45,59)/1 crt=57'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:26 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 61 pg[8.a( v 57'44 (0'0,57'44] local-lis/les=59/61 n=0 ec=59/45 lis/c=45/45 les/c/f=46/46/0 sis=59) [0] r=0 lpr=59 pi=[45,59)/1 crt=57'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:26 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 61 pg[8.1( v 57'44 (0'0,57'44] local-lis/les=59/61 n=1 ec=59/45 lis/c=45/45 les/c/f=46/46/0 sis=59) [0] r=0 lpr=59 pi=[45,59)/1 crt=57'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:26 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 61 pg[8.b( v 57'44 (0'0,57'44] local-lis/les=59/61 n=0 ec=59/45 lis/c=45/45 les/c/f=46/46/0 sis=59) [0] r=0 lpr=59 pi=[45,59)/1 crt=57'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:26 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 61 pg[8.0( v 57'44 (0'0,57'44] local-lis/les=59/61 n=0 ec=45/45 lis/c=45/45 les/c/f=46/46/0 sis=59) [0] r=0 lpr=59 pi=[45,59)/1 crt=57'44 lcod 57'43 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:26 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 61 pg[8.c( v 57'44 (0'0,57'44] local-lis/les=59/61 n=0 ec=59/45 lis/c=45/45 les/c/f=46/46/0 sis=59) [0] r=0 lpr=59 pi=[45,59)/1 crt=57'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:26 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 61 pg[8.6( v 57'44 (0'0,57'44] local-lis/les=59/61 n=0 ec=59/45 lis/c=45/45 les/c/f=46/46/0 sis=59) [0] r=0 lpr=59 pi=[45,59)/1 crt=57'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:26 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 61 pg[8.7( v 57'44 (0'0,57'44] local-lis/les=59/61 n=0 ec=59/45 lis/c=45/45 les/c/f=46/46/0 sis=59) [0] r=0 lpr=59 pi=[45,59)/1 crt=57'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:26 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 61 pg[8.5( v 57'44 (0'0,57'44] local-lis/les=59/61 n=1 ec=59/45 lis/c=45/45 les/c/f=46/46/0 sis=59) [0] r=0 lpr=59 pi=[45,59)/1 crt=57'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:26 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 61 pg[8.4( v 57'44 (0'0,57'44] local-lis/les=59/61 n=1 ec=59/45 lis/c=45/45 les/c/f=46/46/0 sis=59) [0] r=0 lpr=59 pi=[45,59)/1 crt=57'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:26 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 61 pg[8.1b( v 57'44 (0'0,57'44] local-lis/les=59/61 n=0 ec=59/45 lis/c=45/45 les/c/f=46/46/0 sis=59) [0] r=0 lpr=59 pi=[45,59)/1 crt=57'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:26 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 61 pg[8.1a( v 57'44 (0'0,57'44] local-lis/les=59/61 n=0 ec=59/45 lis/c=45/45 les/c/f=46/46/0 sis=59) [0] r=0 lpr=59 pi=[45,59)/1 crt=57'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:26 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 61 pg[8.19( v 57'44 (0'0,57'44] local-lis/les=59/61 n=0 ec=59/45 lis/c=45/45 les/c/f=46/46/0 sis=59) [0] r=0 lpr=59 pi=[45,59)/1 crt=57'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:26 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 61 pg[8.18( v 57'44 (0'0,57'44] local-lis/les=59/61 n=0 ec=59/45 lis/c=45/45 les/c/f=46/46/0 sis=59) [0] r=0 lpr=59 pi=[45,59)/1 crt=57'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:26 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 61 pg[8.1f( v 57'44 (0'0,57'44] local-lis/les=59/61 n=0 ec=59/45 lis/c=45/45 les/c/f=46/46/0 sis=59) [0] r=0 lpr=59 pi=[45,59)/1 crt=57'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:26 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 61 pg[8.1d( v 57'44 (0'0,57'44] local-lis/les=59/61 n=0 ec=59/45 lis/c=45/45 les/c/f=46/46/0 sis=59) [0] r=0 lpr=59 pi=[45,59)/1 crt=57'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:26 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 61 pg[8.1e( v 57'44 (0'0,57'44] local-lis/les=59/61 n=0 ec=59/45 lis/c=45/45 les/c/f=46/46/0 sis=59) [0] r=0 lpr=59 pi=[45,59)/1 crt=57'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:26 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 61 pg[8.1c( v 57'44 (0'0,57'44] local-lis/les=59/61 n=0 ec=59/45 lis/c=45/45 les/c/f=46/46/0 sis=59) [0] r=0 lpr=59 pi=[45,59)/1 crt=57'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:26 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 61 pg[8.12( v 57'44 (0'0,57'44] local-lis/les=59/61 n=0 ec=59/45 lis/c=45/45 les/c/f=46/46/0 sis=59) [0] r=0 lpr=59 pi=[45,59)/1 crt=57'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:26 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 61 pg[8.13( v 57'44 (0'0,57'44] local-lis/les=59/61 n=0 ec=59/45 lis/c=45/45 les/c/f=46/46/0 sis=59) [0] r=0 lpr=59 pi=[45,59)/1 crt=57'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:27 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 8.16 scrub starts
Dec  1 04:51:27 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 8.16 scrub ok
Dec  1 04:51:27 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e62 e62: 3 total, 3 up, 3 in
Dec  1 04:51:27 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num", "val": "32"}]: dispatch
Dec  1 04:51:27 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec  1 04:51:27 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec  1 04:51:27 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num", "val": "32"}]': finished
Dec  1 04:51:27 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num_actual", "val": "32"}]': finished
Dec  1 04:51:27 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num_actual", "val": "32"}]': finished
Dec  1 04:51:27 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num", "val": "32"}]: dispatch
Dec  1 04:51:27 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:51:27 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 62 pg[9.15( v 49'6 lc 0'0 (0'0,49'6] local-lis/les=48/49 n=0 ec=61/48 lis/c=48/48 les/c/f=49/49/0 sis=61) [0] r=0 lpr=61 pi=[48,61)/1 crt=49'6 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:27 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 62 pg[9.14( v 49'6 lc 0'0 (0'0,49'6] local-lis/les=48/49 n=0 ec=61/48 lis/c=48/48 les/c/f=49/49/0 sis=61) [0] r=0 lpr=61 pi=[48,61)/1 crt=49'6 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:27 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 62 pg[9.17( v 49'6 lc 0'0 (0'0,49'6] local-lis/les=48/49 n=0 ec=61/48 lis/c=48/48 les/c/f=49/49/0 sis=61) [0] r=0 lpr=61 pi=[48,61)/1 crt=49'6 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:27 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 62 pg[9.16( v 49'6 lc 0'0 (0'0,49'6] local-lis/les=48/49 n=0 ec=61/48 lis/c=48/48 les/c/f=49/49/0 sis=61) [0] r=0 lpr=61 pi=[48,61)/1 crt=49'6 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:27 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 62 pg[9.11( v 49'6 lc 0'0 (0'0,49'6] local-lis/les=48/49 n=0 ec=61/48 lis/c=48/48 les/c/f=49/49/0 sis=61) [0] r=0 lpr=61 pi=[48,61)/1 crt=49'6 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:27 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 62 pg[9.10( v 49'6 lc 0'0 (0'0,49'6] local-lis/les=48/49 n=0 ec=61/48 lis/c=48/48 les/c/f=49/49/0 sis=61) [0] r=0 lpr=61 pi=[48,61)/1 crt=49'6 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:27 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 62 pg[9.3( v 49'6 lc 0'0 (0'0,49'6] local-lis/les=48/49 n=1 ec=61/48 lis/c=48/48 les/c/f=49/49/0 sis=61) [0] r=0 lpr=61 pi=[48,61)/1 crt=49'6 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:27 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 62 pg[9.2( v 49'6 lc 0'0 (0'0,49'6] local-lis/les=48/49 n=1 ec=61/48 lis/c=48/48 les/c/f=49/49/0 sis=61) [0] r=0 lpr=61 pi=[48,61)/1 crt=49'6 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:27 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 62 pg[9.e( v 49'6 lc 0'0 (0'0,49'6] local-lis/les=48/49 n=0 ec=61/48 lis/c=48/48 les/c/f=49/49/0 sis=61) [0] r=0 lpr=61 pi=[48,61)/1 crt=49'6 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:27 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 62 pg[9.9( v 49'6 lc 0'0 (0'0,49'6] local-lis/les=48/49 n=0 ec=61/48 lis/c=48/48 les/c/f=49/49/0 sis=61) [0] r=0 lpr=61 pi=[48,61)/1 crt=49'6 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:27 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 62 pg[9.b( v 49'6 lc 0'0 (0'0,49'6] local-lis/les=48/49 n=0 ec=61/48 lis/c=48/48 les/c/f=49/49/0 sis=61) [0] r=0 lpr=61 pi=[48,61)/1 crt=49'6 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:27 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 62 pg[9.8( v 49'6 lc 0'0 (0'0,49'6] local-lis/les=48/49 n=0 ec=61/48 lis/c=48/48 les/c/f=49/49/0 sis=61) [0] r=0 lpr=61 pi=[48,61)/1 crt=49'6 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:27 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 62 pg[9.f( v 49'6 lc 0'0 (0'0,49'6] local-lis/les=48/49 n=0 ec=61/48 lis/c=48/48 les/c/f=49/49/0 sis=61) [0] r=0 lpr=61 pi=[48,61)/1 crt=49'6 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:27 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 62 pg[9.c( v 49'6 lc 0'0 (0'0,49'6] local-lis/les=48/49 n=0 ec=61/48 lis/c=48/48 les/c/f=49/49/0 sis=61) [0] r=0 lpr=61 pi=[48,61)/1 crt=49'6 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:27 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 62 pg[9.d( v 49'6 lc 0'0 (0'0,49'6] local-lis/les=48/49 n=0 ec=61/48 lis/c=48/48 les/c/f=49/49/0 sis=61) [0] r=0 lpr=61 pi=[48,61)/1 crt=49'6 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:27 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 62 pg[9.a( v 49'6 lc 0'0 (0'0,49'6] local-lis/les=48/49 n=0 ec=61/48 lis/c=48/48 les/c/f=49/49/0 sis=61) [0] r=0 lpr=61 pi=[48,61)/1 crt=49'6 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:27 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 62 pg[9.6( v 49'6 lc 0'0 (0'0,49'6] local-lis/les=48/49 n=1 ec=61/48 lis/c=48/48 les/c/f=49/49/0 sis=61) [0] r=0 lpr=61 pi=[48,61)/1 crt=49'6 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:27 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 62 pg[9.1( v 49'6 (0'0,49'6] local-lis/les=48/49 n=1 ec=61/48 lis/c=48/48 les/c/f=49/49/0 sis=61) [0] r=0 lpr=61 pi=[48,61)/1 crt=49'6 lcod 0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:27 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 62 pg[9.7( v 49'6 lc 0'0 (0'0,49'6] local-lis/les=48/49 n=0 ec=61/48 lis/c=48/48 les/c/f=49/49/0 sis=61) [0] r=0 lpr=61 pi=[48,61)/1 crt=49'6 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:27 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 62 pg[9.4( v 49'6 lc 0'0 (0'0,49'6] local-lis/les=48/49 n=1 ec=61/48 lis/c=48/48 les/c/f=49/49/0 sis=61) [0] r=0 lpr=61 pi=[48,61)/1 crt=49'6 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:27 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 62 pg[9.5( v 49'6 lc 0'0 (0'0,49'6] local-lis/les=48/49 n=1 ec=61/48 lis/c=48/48 les/c/f=49/49/0 sis=61) [0] r=0 lpr=61 pi=[48,61)/1 crt=49'6 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:27 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 62 pg[9.1a( v 49'6 lc 0'0 (0'0,49'6] local-lis/les=48/49 n=0 ec=61/48 lis/c=48/48 les/c/f=49/49/0 sis=61) [0] r=0 lpr=61 pi=[48,61)/1 crt=49'6 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:27 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 62 pg[9.1b( v 49'6 lc 0'0 (0'0,49'6] local-lis/les=48/49 n=0 ec=61/48 lis/c=48/48 les/c/f=49/49/0 sis=61) [0] r=0 lpr=61 pi=[48,61)/1 crt=49'6 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:27 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 62 pg[9.18( v 49'6 lc 0'0 (0'0,49'6] local-lis/les=48/49 n=0 ec=61/48 lis/c=48/48 les/c/f=49/49/0 sis=61) [0] r=0 lpr=61 pi=[48,61)/1 crt=49'6 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:27 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 62 pg[9.19( v 49'6 lc 0'0 (0'0,49'6] local-lis/les=48/49 n=0 ec=61/48 lis/c=48/48 les/c/f=49/49/0 sis=61) [0] r=0 lpr=61 pi=[48,61)/1 crt=49'6 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:27 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 62 pg[9.1e( v 49'6 lc 0'0 (0'0,49'6] local-lis/les=48/49 n=0 ec=61/48 lis/c=48/48 les/c/f=49/49/0 sis=61) [0] r=0 lpr=61 pi=[48,61)/1 crt=49'6 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:27 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 62 pg[9.1f( v 49'6 lc 0'0 (0'0,49'6] local-lis/les=48/49 n=0 ec=61/48 lis/c=48/48 les/c/f=49/49/0 sis=61) [0] r=0 lpr=61 pi=[48,61)/1 crt=49'6 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:27 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 62 pg[9.1c( v 49'6 lc 0'0 (0'0,49'6] local-lis/les=48/49 n=0 ec=61/48 lis/c=48/48 les/c/f=49/49/0 sis=61) [0] r=0 lpr=61 pi=[48,61)/1 crt=49'6 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:27 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 62 pg[9.1d( v 49'6 lc 0'0 (0'0,49'6] local-lis/les=48/49 n=0 ec=61/48 lis/c=48/48 les/c/f=49/49/0 sis=61) [0] r=0 lpr=61 pi=[48,61)/1 crt=49'6 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:27 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 62 pg[9.12( v 49'6 lc 0'0 (0'0,49'6] local-lis/les=48/49 n=0 ec=61/48 lis/c=48/48 les/c/f=49/49/0 sis=61) [0] r=0 lpr=61 pi=[48,61)/1 crt=49'6 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:27 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 62 pg[9.13( v 49'6 lc 0'0 (0'0,49'6] local-lis/les=48/49 n=0 ec=61/48 lis/c=48/48 les/c/f=49/49/0 sis=61) [0] r=0 lpr=61 pi=[48,61)/1 crt=49'6 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:27 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 62 pg[9.14( v 49'6 (0'0,49'6] local-lis/les=61/62 n=0 ec=61/48 lis/c=48/48 les/c/f=49/49/0 sis=61) [0] r=0 lpr=61 pi=[48,61)/1 crt=49'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:27 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 62 pg[9.16( v 49'6 (0'0,49'6] local-lis/les=61/62 n=0 ec=61/48 lis/c=48/48 les/c/f=49/49/0 sis=61) [0] r=0 lpr=61 pi=[48,61)/1 crt=49'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:27 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 62 pg[9.15( v 49'6 (0'0,49'6] local-lis/les=61/62 n=0 ec=61/48 lis/c=48/48 les/c/f=49/49/0 sis=61) [0] r=0 lpr=61 pi=[48,61)/1 crt=49'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:27 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 62 pg[9.17( v 49'6 (0'0,49'6] local-lis/les=61/62 n=0 ec=61/48 lis/c=48/48 les/c/f=49/49/0 sis=61) [0] r=0 lpr=61 pi=[48,61)/1 crt=49'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:27 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 62 pg[9.11( v 49'6 (0'0,49'6] local-lis/les=61/62 n=0 ec=61/48 lis/c=48/48 les/c/f=49/49/0 sis=61) [0] r=0 lpr=61 pi=[48,61)/1 crt=49'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:27 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 62 pg[9.3( v 49'6 (0'0,49'6] local-lis/les=61/62 n=1 ec=61/48 lis/c=48/48 les/c/f=49/49/0 sis=61) [0] r=0 lpr=61 pi=[48,61)/1 crt=49'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:27 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 62 pg[9.2( v 49'6 (0'0,49'6] local-lis/les=61/62 n=1 ec=61/48 lis/c=48/48 les/c/f=49/49/0 sis=61) [0] r=0 lpr=61 pi=[48,61)/1 crt=49'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:27 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 62 pg[9.10( v 49'6 (0'0,49'6] local-lis/les=61/62 n=0 ec=61/48 lis/c=48/48 les/c/f=49/49/0 sis=61) [0] r=0 lpr=61 pi=[48,61)/1 crt=49'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:27 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 62 pg[9.e( v 49'6 (0'0,49'6] local-lis/les=61/62 n=0 ec=61/48 lis/c=48/48 les/c/f=49/49/0 sis=61) [0] r=0 lpr=61 pi=[48,61)/1 crt=49'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:27 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 62 pg[9.b( v 49'6 (0'0,49'6] local-lis/les=61/62 n=0 ec=61/48 lis/c=48/48 les/c/f=49/49/0 sis=61) [0] r=0 lpr=61 pi=[48,61)/1 crt=49'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:27 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 62 pg[9.f( v 49'6 (0'0,49'6] local-lis/les=61/62 n=0 ec=61/48 lis/c=48/48 les/c/f=49/49/0 sis=61) [0] r=0 lpr=61 pi=[48,61)/1 crt=49'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:27 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 62 pg[9.8( v 49'6 (0'0,49'6] local-lis/les=61/62 n=0 ec=61/48 lis/c=48/48 les/c/f=49/49/0 sis=61) [0] r=0 lpr=61 pi=[48,61)/1 crt=49'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:27 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 62 pg[9.9( v 49'6 (0'0,49'6] local-lis/les=61/62 n=0 ec=61/48 lis/c=48/48 les/c/f=49/49/0 sis=61) [0] r=0 lpr=61 pi=[48,61)/1 crt=49'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:27 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 62 pg[9.c( v 49'6 (0'0,49'6] local-lis/les=61/62 n=0 ec=61/48 lis/c=48/48 les/c/f=49/49/0 sis=61) [0] r=0 lpr=61 pi=[48,61)/1 crt=49'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:27 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 62 pg[9.a( v 49'6 (0'0,49'6] local-lis/les=61/62 n=0 ec=61/48 lis/c=48/48 les/c/f=49/49/0 sis=61) [0] r=0 lpr=61 pi=[48,61)/1 crt=49'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:27 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 62 pg[9.0( v 49'6 (0'0,49'6] local-lis/les=61/62 n=0 ec=48/48 lis/c=48/48 les/c/f=49/49/0 sis=61) [0] r=0 lpr=61 pi=[48,61)/1 crt=49'6 lcod 49'5 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:27 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 62 pg[9.d( v 49'6 (0'0,49'6] local-lis/les=61/62 n=0 ec=61/48 lis/c=48/48 les/c/f=49/49/0 sis=61) [0] r=0 lpr=61 pi=[48,61)/1 crt=49'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:27 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 62 pg[9.7( v 49'6 (0'0,49'6] local-lis/les=61/62 n=0 ec=61/48 lis/c=48/48 les/c/f=49/49/0 sis=61) [0] r=0 lpr=61 pi=[48,61)/1 crt=49'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:27 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 62 pg[9.6( v 49'6 (0'0,49'6] local-lis/les=61/62 n=1 ec=61/48 lis/c=48/48 les/c/f=49/49/0 sis=61) [0] r=0 lpr=61 pi=[48,61)/1 crt=49'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:27 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 62 pg[9.1( v 49'6 (0'0,49'6] local-lis/les=61/62 n=1 ec=61/48 lis/c=48/48 les/c/f=49/49/0 sis=61) [0] r=0 lpr=61 pi=[48,61)/1 crt=49'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:27 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 62 pg[9.4( v 49'6 (0'0,49'6] local-lis/les=61/62 n=1 ec=61/48 lis/c=48/48 les/c/f=49/49/0 sis=61) [0] r=0 lpr=61 pi=[48,61)/1 crt=49'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:27 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 62 pg[9.1a( v 49'6 (0'0,49'6] local-lis/les=61/62 n=0 ec=61/48 lis/c=48/48 les/c/f=49/49/0 sis=61) [0] r=0 lpr=61 pi=[48,61)/1 crt=49'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:27 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 62 pg[9.5( v 49'6 (0'0,49'6] local-lis/les=61/62 n=1 ec=61/48 lis/c=48/48 les/c/f=49/49/0 sis=61) [0] r=0 lpr=61 pi=[48,61)/1 crt=49'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:27 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 62 pg[9.1b( v 49'6 (0'0,49'6] local-lis/les=61/62 n=0 ec=61/48 lis/c=48/48 les/c/f=49/49/0 sis=61) [0] r=0 lpr=61 pi=[48,61)/1 crt=49'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:27 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 62 pg[9.18( v 49'6 (0'0,49'6] local-lis/les=61/62 n=0 ec=61/48 lis/c=48/48 les/c/f=49/49/0 sis=61) [0] r=0 lpr=61 pi=[48,61)/1 crt=49'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:27 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 62 pg[9.1e( v 49'6 (0'0,49'6] local-lis/les=61/62 n=0 ec=61/48 lis/c=48/48 les/c/f=49/49/0 sis=61) [0] r=0 lpr=61 pi=[48,61)/1 crt=49'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:27 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 62 pg[9.19( v 49'6 (0'0,49'6] local-lis/les=61/62 n=0 ec=61/48 lis/c=48/48 les/c/f=49/49/0 sis=61) [0] r=0 lpr=61 pi=[48,61)/1 crt=49'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:27 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 62 pg[9.1c( v 49'6 (0'0,49'6] local-lis/les=61/62 n=0 ec=61/48 lis/c=48/48 les/c/f=49/49/0 sis=61) [0] r=0 lpr=61 pi=[48,61)/1 crt=49'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:27 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 62 pg[9.1f( v 49'6 (0'0,49'6] local-lis/les=61/62 n=0 ec=61/48 lis/c=48/48 les/c/f=49/49/0 sis=61) [0] r=0 lpr=61 pi=[48,61)/1 crt=49'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:27 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 62 pg[9.1d( v 49'6 (0'0,49'6] local-lis/les=61/62 n=0 ec=61/48 lis/c=48/48 les/c/f=49/49/0 sis=61) [0] r=0 lpr=61 pi=[48,61)/1 crt=49'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:27 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 62 pg[9.12( v 49'6 (0'0,49'6] local-lis/les=61/62 n=0 ec=61/48 lis/c=48/48 les/c/f=49/49/0 sis=61) [0] r=0 lpr=61 pi=[48,61)/1 crt=49'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:27 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 62 pg[9.13( v 49'6 (0'0,49'6] local-lis/les=61/62 n=0 ec=61/48 lis/c=48/48 les/c/f=49/49/0 sis=61) [0] r=0 lpr=61 pi=[48,61)/1 crt=49'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:28 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 8.10 scrub starts
Dec  1 04:51:28 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 8.10 scrub ok
Dec  1 04:51:28 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e63 e63: 3 total, 3 up, 3 in
Dec  1 04:51:28 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:51:28 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec  1 04:51:28 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num", "val": "32"}]': finished
Dec  1 04:51:28 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:51:28 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 63 pg[11.0( v 53'48 (0'0,53'48] local-lis/les=52/53 n=8 ec=52/52 lis/c=52/52 les/c/f=53/53/0 sis=63 pruub=12.645111084s) [0] r=0 lpr=63 pi=[52,63)/1 crt=53'48 lcod 53'47 mlcod 53'47 active pruub 194.737747192s@ mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [0], acting_primary 0 -> 0, up_primary 0 -> 0, role 0 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:28 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 63 pg[11.0( v 53'48 lc 0'0 (0'0,53'48] local-lis/les=52/53 n=0 ec=52/52 lis/c=52/52 les/c/f=53/53/0 sis=63 pruub=12.645111084s) [0] r=0 lpr=63 pi=[52,63)/1 crt=53'48 lcod 53'47 mlcod 0'0 unknown pruub 194.737747192s@ mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:28 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis[85207]: [WARNING] 334/095128 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  1 04:51:29 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 8.17 scrub starts
Dec  1 04:51:29 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 8.17 scrub ok
Dec  1 04:51:29 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e63 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:51:29 np0005540826 ceph-mon[80026]: Deploying daemon haproxy.nfs.cephfs.compute-2.bdogrt on compute-2
Dec  1 04:51:29 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num_actual", "val": "32"}]': finished
Dec  1 04:51:29 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec  1 04:51:30 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 8.14 scrub starts
Dec  1 04:51:30 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 8.14 scrub ok
Dec  1 04:51:30 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e64 e64: 3 total, 3 up, 3 in
Dec  1 04:51:30 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 64 pg[11.17( v 53'48 lc 0'0 (0'0,53'48] local-lis/les=52/53 n=0 ec=63/52 lis/c=52/52 les/c/f=53/53/0 sis=63) [0] r=0 lpr=63 pi=[52,63)/1 crt=53'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:30 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 64 pg[11.16( v 53'48 lc 0'0 (0'0,53'48] local-lis/les=52/53 n=0 ec=63/52 lis/c=52/52 les/c/f=53/53/0 sis=63) [0] r=0 lpr=63 pi=[52,63)/1 crt=53'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:30 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 64 pg[11.15( v 53'48 lc 0'0 (0'0,53'48] local-lis/les=52/53 n=0 ec=63/52 lis/c=52/52 les/c/f=53/53/0 sis=63) [0] r=0 lpr=63 pi=[52,63)/1 crt=53'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:30 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 64 pg[11.14( v 53'48 lc 0'0 (0'0,53'48] local-lis/les=52/53 n=0 ec=63/52 lis/c=52/52 les/c/f=53/53/0 sis=63) [0] r=0 lpr=63 pi=[52,63)/1 crt=53'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:30 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 64 pg[11.12( v 53'48 lc 0'0 (0'0,53'48] local-lis/les=52/53 n=0 ec=63/52 lis/c=52/52 les/c/f=53/53/0 sis=63) [0] r=0 lpr=63 pi=[52,63)/1 crt=53'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:30 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 64 pg[11.1( v 53'48 (0'0,53'48] local-lis/les=52/53 n=1 ec=63/52 lis/c=52/52 les/c/f=53/53/0 sis=63) [0] r=0 lpr=63 pi=[52,63)/1 crt=53'48 lcod 0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:30 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 64 pg[11.13( v 53'48 lc 0'0 (0'0,53'48] local-lis/les=52/53 n=0 ec=63/52 lis/c=52/52 les/c/f=53/53/0 sis=63) [0] r=0 lpr=63 pi=[52,63)/1 crt=53'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:30 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 64 pg[11.c( v 53'48 lc 0'0 (0'0,53'48] local-lis/les=52/53 n=0 ec=63/52 lis/c=52/52 les/c/f=53/53/0 sis=63) [0] r=0 lpr=63 pi=[52,63)/1 crt=53'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:30 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 64 pg[11.b( v 53'48 lc 0'0 (0'0,53'48] local-lis/les=52/53 n=0 ec=63/52 lis/c=52/52 les/c/f=53/53/0 sis=63) [0] r=0 lpr=63 pi=[52,63)/1 crt=53'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:30 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 64 pg[11.a( v 53'48 lc 0'0 (0'0,53'48] local-lis/les=52/53 n=0 ec=63/52 lis/c=52/52 les/c/f=53/53/0 sis=63) [0] r=0 lpr=63 pi=[52,63)/1 crt=53'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:30 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 64 pg[11.9( v 53'48 lc 0'0 (0'0,53'48] local-lis/les=52/53 n=0 ec=63/52 lis/c=52/52 les/c/f=53/53/0 sis=63) [0] r=0 lpr=63 pi=[52,63)/1 crt=53'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:30 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 64 pg[11.d( v 53'48 lc 0'0 (0'0,53'48] local-lis/les=52/53 n=0 ec=63/52 lis/c=52/52 les/c/f=53/53/0 sis=63) [0] r=0 lpr=63 pi=[52,63)/1 crt=53'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:30 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 64 pg[11.e( v 53'48 lc 0'0 (0'0,53'48] local-lis/les=52/53 n=0 ec=63/52 lis/c=52/52 les/c/f=53/53/0 sis=63) [0] r=0 lpr=63 pi=[52,63)/1 crt=53'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:30 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 64 pg[11.f( v 53'48 lc 0'0 (0'0,53'48] local-lis/les=52/53 n=0 ec=63/52 lis/c=52/52 les/c/f=53/53/0 sis=63) [0] r=0 lpr=63 pi=[52,63)/1 crt=53'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:30 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 64 pg[11.8( v 53'48 lc 0'0 (0'0,53'48] local-lis/les=52/53 n=1 ec=63/52 lis/c=52/52 les/c/f=53/53/0 sis=63) [0] r=0 lpr=63 pi=[52,63)/1 crt=53'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:30 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 64 pg[11.2( v 53'48 lc 0'0 (0'0,53'48] local-lis/les=52/53 n=1 ec=63/52 lis/c=52/52 les/c/f=53/53/0 sis=63) [0] r=0 lpr=63 pi=[52,63)/1 crt=53'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:30 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 64 pg[11.3( v 53'48 lc 0'0 (0'0,53'48] local-lis/les=52/53 n=1 ec=63/52 lis/c=52/52 les/c/f=53/53/0 sis=63) [0] r=0 lpr=63 pi=[52,63)/1 crt=53'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:30 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 64 pg[11.4( v 53'48 lc 0'0 (0'0,53'48] local-lis/les=52/53 n=1 ec=63/52 lis/c=52/52 les/c/f=53/53/0 sis=63) [0] r=0 lpr=63 pi=[52,63)/1 crt=53'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:30 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 64 pg[11.5( v 53'48 lc 0'0 (0'0,53'48] local-lis/les=52/53 n=1 ec=63/52 lis/c=52/52 les/c/f=53/53/0 sis=63) [0] r=0 lpr=63 pi=[52,63)/1 crt=53'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:30 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 64 pg[11.6( v 53'48 lc 0'0 (0'0,53'48] local-lis/les=52/53 n=1 ec=63/52 lis/c=52/52 les/c/f=53/53/0 sis=63) [0] r=0 lpr=63 pi=[52,63)/1 crt=53'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:30 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 64 pg[11.7( v 53'48 lc 0'0 (0'0,53'48] local-lis/les=52/53 n=1 ec=63/52 lis/c=52/52 les/c/f=53/53/0 sis=63) [0] r=0 lpr=63 pi=[52,63)/1 crt=53'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:30 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 64 pg[11.18( v 53'48 lc 0'0 (0'0,53'48] local-lis/les=52/53 n=0 ec=63/52 lis/c=52/52 les/c/f=53/53/0 sis=63) [0] r=0 lpr=63 pi=[52,63)/1 crt=53'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:30 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 64 pg[11.19( v 53'48 lc 0'0 (0'0,53'48] local-lis/les=52/53 n=0 ec=63/52 lis/c=52/52 les/c/f=53/53/0 sis=63) [0] r=0 lpr=63 pi=[52,63)/1 crt=53'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:30 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 64 pg[11.1a( v 53'48 lc 0'0 (0'0,53'48] local-lis/les=52/53 n=0 ec=63/52 lis/c=52/52 les/c/f=53/53/0 sis=63) [0] r=0 lpr=63 pi=[52,63)/1 crt=53'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:30 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 64 pg[11.1b( v 53'48 lc 0'0 (0'0,53'48] local-lis/les=52/53 n=0 ec=63/52 lis/c=52/52 les/c/f=53/53/0 sis=63) [0] r=0 lpr=63 pi=[52,63)/1 crt=53'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:30 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 64 pg[11.1c( v 53'48 lc 0'0 (0'0,53'48] local-lis/les=52/53 n=0 ec=63/52 lis/c=52/52 les/c/f=53/53/0 sis=63) [0] r=0 lpr=63 pi=[52,63)/1 crt=53'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:30 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 64 pg[11.1d( v 53'48 lc 0'0 (0'0,53'48] local-lis/les=52/53 n=0 ec=63/52 lis/c=52/52 les/c/f=53/53/0 sis=63) [0] r=0 lpr=63 pi=[52,63)/1 crt=53'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:30 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 64 pg[11.1e( v 53'48 lc 0'0 (0'0,53'48] local-lis/les=52/53 n=0 ec=63/52 lis/c=52/52 les/c/f=53/53/0 sis=63) [0] r=0 lpr=63 pi=[52,63)/1 crt=53'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:30 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 64 pg[11.1f( v 53'48 lc 0'0 (0'0,53'48] local-lis/les=52/53 n=0 ec=63/52 lis/c=52/52 les/c/f=53/53/0 sis=63) [0] r=0 lpr=63 pi=[52,63)/1 crt=53'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:30 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 64 pg[11.10( v 53'48 lc 0'0 (0'0,53'48] local-lis/les=52/53 n=0 ec=63/52 lis/c=52/52 les/c/f=53/53/0 sis=63) [0] r=0 lpr=63 pi=[52,63)/1 crt=53'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:30 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 64 pg[11.11( v 53'48 lc 0'0 (0'0,53'48] local-lis/les=52/53 n=0 ec=63/52 lis/c=52/52 les/c/f=53/53/0 sis=63) [0] r=0 lpr=63 pi=[52,63)/1 crt=53'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:30 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 64 pg[11.16( v 53'48 (0'0,53'48] local-lis/les=63/64 n=0 ec=63/52 lis/c=52/52 les/c/f=53/53/0 sis=63) [0] r=0 lpr=63 pi=[52,63)/1 crt=53'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:30 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 64 pg[11.17( v 53'48 (0'0,53'48] local-lis/les=63/64 n=0 ec=63/52 lis/c=52/52 les/c/f=53/53/0 sis=63) [0] r=0 lpr=63 pi=[52,63)/1 crt=53'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:30 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 64 pg[11.14( v 53'48 (0'0,53'48] local-lis/les=63/64 n=0 ec=63/52 lis/c=52/52 les/c/f=53/53/0 sis=63) [0] r=0 lpr=63 pi=[52,63)/1 crt=53'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:30 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 64 pg[11.15( v 53'48 (0'0,53'48] local-lis/les=63/64 n=0 ec=63/52 lis/c=52/52 les/c/f=53/53/0 sis=63) [0] r=0 lpr=63 pi=[52,63)/1 crt=53'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:30 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 64 pg[11.12( v 53'48 (0'0,53'48] local-lis/les=63/64 n=0 ec=63/52 lis/c=52/52 les/c/f=53/53/0 sis=63) [0] r=0 lpr=63 pi=[52,63)/1 crt=53'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:30 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 64 pg[11.0( v 53'48 (0'0,53'48] local-lis/les=63/64 n=0 ec=52/52 lis/c=52/52 les/c/f=53/53/0 sis=63) [0] r=0 lpr=63 pi=[52,63)/1 crt=53'48 lcod 53'47 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:30 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 64 pg[11.c( v 53'48 (0'0,53'48] local-lis/les=63/64 n=0 ec=63/52 lis/c=52/52 les/c/f=53/53/0 sis=63) [0] r=0 lpr=63 pi=[52,63)/1 crt=53'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:30 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 64 pg[11.1( v 53'48 (0'0,53'48] local-lis/les=63/64 n=1 ec=63/52 lis/c=52/52 les/c/f=53/53/0 sis=63) [0] r=0 lpr=63 pi=[52,63)/1 crt=53'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:30 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 64 pg[11.b( v 53'48 (0'0,53'48] local-lis/les=63/64 n=0 ec=63/52 lis/c=52/52 les/c/f=53/53/0 sis=63) [0] r=0 lpr=63 pi=[52,63)/1 crt=53'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:30 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 64 pg[11.9( v 53'48 (0'0,53'48] local-lis/les=63/64 n=0 ec=63/52 lis/c=52/52 les/c/f=53/53/0 sis=63) [0] r=0 lpr=63 pi=[52,63)/1 crt=53'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:30 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 64 pg[11.a( v 53'48 (0'0,53'48] local-lis/les=63/64 n=0 ec=63/52 lis/c=52/52 les/c/f=53/53/0 sis=63) [0] r=0 lpr=63 pi=[52,63)/1 crt=53'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:30 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 64 pg[11.e( v 53'48 (0'0,53'48] local-lis/les=63/64 n=0 ec=63/52 lis/c=52/52 les/c/f=53/53/0 sis=63) [0] r=0 lpr=63 pi=[52,63)/1 crt=53'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:30 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 64 pg[11.f( v 53'48 (0'0,53'48] local-lis/les=63/64 n=0 ec=63/52 lis/c=52/52 les/c/f=53/53/0 sis=63) [0] r=0 lpr=63 pi=[52,63)/1 crt=53'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:30 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 64 pg[11.8( v 53'48 (0'0,53'48] local-lis/les=63/64 n=1 ec=63/52 lis/c=52/52 les/c/f=53/53/0 sis=63) [0] r=0 lpr=63 pi=[52,63)/1 crt=53'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:30 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 64 pg[11.13( v 53'48 (0'0,53'48] local-lis/les=63/64 n=0 ec=63/52 lis/c=52/52 les/c/f=53/53/0 sis=63) [0] r=0 lpr=63 pi=[52,63)/1 crt=53'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:30 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 64 pg[11.2( v 53'48 (0'0,53'48] local-lis/les=63/64 n=1 ec=63/52 lis/c=52/52 les/c/f=53/53/0 sis=63) [0] r=0 lpr=63 pi=[52,63)/1 crt=53'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:30 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 64 pg[11.4( v 53'48 (0'0,53'48] local-lis/les=63/64 n=1 ec=63/52 lis/c=52/52 les/c/f=53/53/0 sis=63) [0] r=0 lpr=63 pi=[52,63)/1 crt=53'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:30 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 64 pg[11.5( v 53'48 (0'0,53'48] local-lis/les=63/64 n=1 ec=63/52 lis/c=52/52 les/c/f=53/53/0 sis=63) [0] r=0 lpr=63 pi=[52,63)/1 crt=53'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:30 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 64 pg[11.3( v 53'48 (0'0,53'48] local-lis/les=63/64 n=1 ec=63/52 lis/c=52/52 les/c/f=53/53/0 sis=63) [0] r=0 lpr=63 pi=[52,63)/1 crt=53'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:30 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 64 pg[11.6( v 53'48 (0'0,53'48] local-lis/les=63/64 n=1 ec=63/52 lis/c=52/52 les/c/f=53/53/0 sis=63) [0] r=0 lpr=63 pi=[52,63)/1 crt=53'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:30 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 64 pg[11.19( v 53'48 (0'0,53'48] local-lis/les=63/64 n=0 ec=63/52 lis/c=52/52 les/c/f=53/53/0 sis=63) [0] r=0 lpr=63 pi=[52,63)/1 crt=53'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:30 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 64 pg[11.7( v 53'48 (0'0,53'48] local-lis/les=63/64 n=1 ec=63/52 lis/c=52/52 les/c/f=53/53/0 sis=63) [0] r=0 lpr=63 pi=[52,63)/1 crt=53'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:30 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 64 pg[11.d( v 53'48 (0'0,53'48] local-lis/les=63/64 n=0 ec=63/52 lis/c=52/52 les/c/f=53/53/0 sis=63) [0] r=0 lpr=63 pi=[52,63)/1 crt=53'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:30 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 64 pg[11.18( v 53'48 (0'0,53'48] local-lis/les=63/64 n=0 ec=63/52 lis/c=52/52 les/c/f=53/53/0 sis=63) [0] r=0 lpr=63 pi=[52,63)/1 crt=53'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:30 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 64 pg[11.1a( v 53'48 (0'0,53'48] local-lis/les=63/64 n=0 ec=63/52 lis/c=52/52 les/c/f=53/53/0 sis=63) [0] r=0 lpr=63 pi=[52,63)/1 crt=53'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:30 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 64 pg[11.1c( v 53'48 (0'0,53'48] local-lis/les=63/64 n=0 ec=63/52 lis/c=52/52 les/c/f=53/53/0 sis=63) [0] r=0 lpr=63 pi=[52,63)/1 crt=53'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:30 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 64 pg[11.1d( v 53'48 (0'0,53'48] local-lis/les=63/64 n=0 ec=63/52 lis/c=52/52 les/c/f=53/53/0 sis=63) [0] r=0 lpr=63 pi=[52,63)/1 crt=53'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:30 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 64 pg[11.1e( v 53'48 (0'0,53'48] local-lis/les=63/64 n=0 ec=63/52 lis/c=52/52 les/c/f=53/53/0 sis=63) [0] r=0 lpr=63 pi=[52,63)/1 crt=53'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:30 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 64 pg[11.1f( v 53'48 (0'0,53'48] local-lis/les=63/64 n=0 ec=63/52 lis/c=52/52 les/c/f=53/53/0 sis=63) [0] r=0 lpr=63 pi=[52,63)/1 crt=53'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:30 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 64 pg[11.11( v 53'48 (0'0,53'48] local-lis/les=63/64 n=0 ec=63/52 lis/c=52/52 les/c/f=53/53/0 sis=63) [0] r=0 lpr=63 pi=[52,63)/1 crt=53'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:30 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 64 pg[11.10( v 53'48 (0'0,53'48] local-lis/les=63/64 n=0 ec=63/52 lis/c=52/52 les/c/f=53/53/0 sis=63) [0] r=0 lpr=63 pi=[52,63)/1 crt=53'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:30 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 64 pg[11.1b( v 53'48 (0'0,53'48] local-lis/les=63/64 n=0 ec=63/52 lis/c=52/52 les/c/f=53/53/0 sis=63) [0] r=0 lpr=63 pi=[52,63)/1 crt=53'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:31 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 8.3 scrub starts
Dec  1 04:51:31 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 8.3 scrub ok
Dec  1 04:51:31 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num_actual", "val": "32"}]': finished
Dec  1 04:51:32 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 8.2 scrub starts
Dec  1 04:51:32 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 8.2 scrub ok
Dec  1 04:51:32 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e65 e65: 3 total, 3 up, 3 in
Dec  1 04:51:33 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 8.15 scrub starts
Dec  1 04:51:33 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 8.15 scrub ok
Dec  1 04:51:33 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:51:34 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 8.f scrub starts
Dec  1 04:51:34 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 8.f scrub ok
Dec  1 04:51:34 np0005540826 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.0.0.compute-1.osfnzc.service: Scheduled restart job, restart counter is at 1.
Dec  1 04:51:34 np0005540826 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.osfnzc for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 04:51:34 np0005540826 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.0.0.compute-1.osfnzc.service: Consumed 1.455s CPU time.
Dec  1 04:51:34 np0005540826 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.osfnzc for 365f19c2-81e5-5edd-b6b4-280555214d3a...
Dec  1 04:51:34 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e65 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:51:34 np0005540826 podman[85321]: 2025-12-01 09:51:34.640227574 +0000 UTC m=+0.037607494 container create 470db443dfbb44b498655286f13305e4bbfd9663e0774e5d00795d81ba1e8b0e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.vendor=CentOS, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:51:34 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/db5e876a289a7c8f96978333827b4cee0e52c61705fee3fbba3b464c4e2cfdd5/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Dec  1 04:51:34 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/db5e876a289a7c8f96978333827b4cee0e52c61705fee3fbba3b464c4e2cfdd5/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:51:34 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/db5e876a289a7c8f96978333827b4cee0e52c61705fee3fbba3b464c4e2cfdd5/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 04:51:34 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/db5e876a289a7c8f96978333827b4cee0e52c61705fee3fbba3b464c4e2cfdd5/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.osfnzc-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 04:51:34 np0005540826 podman[85321]: 2025-12-01 09:51:34.70760468 +0000 UTC m=+0.104984620 container init 470db443dfbb44b498655286f13305e4bbfd9663e0774e5d00795d81ba1e8b0e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, OSD_FLAVOR=default)
Dec  1 04:51:34 np0005540826 podman[85321]: 2025-12-01 09:51:34.712237349 +0000 UTC m=+0.109617279 container start 470db443dfbb44b498655286f13305e4bbfd9663e0774e5d00795d81ba1e8b0e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc, ceph=True, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, OSD_FLAVOR=default)
Dec  1 04:51:34 np0005540826 podman[85321]: 2025-12-01 09:51:34.620846578 +0000 UTC m=+0.018226498 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 04:51:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:51:34 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Dec  1 04:51:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:51:34 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Dec  1 04:51:34 np0005540826 bash[85321]: 470db443dfbb44b498655286f13305e4bbfd9663e0774e5d00795d81ba1e8b0e
Dec  1 04:51:34 np0005540826 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.osfnzc for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 04:51:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:51:34 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Dec  1 04:51:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:51:34 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Dec  1 04:51:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:51:34 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Dec  1 04:51:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:51:34 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Dec  1 04:51:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:51:34 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Dec  1 04:51:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:51:34 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  1 04:51:35 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 8.11 scrub starts
Dec  1 04:51:35 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 8.11 scrub ok
Dec  1 04:51:36 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:51:36 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:51:36 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:51:36 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:51:36 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 8.e scrub starts
Dec  1 04:51:36 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 8.e scrub ok
Dec  1 04:51:37 np0005540826 ceph-mon[80026]: 192.168.122.2 is in 192.168.122.0/24 on compute-1 interface br-ex
Dec  1 04:51:37 np0005540826 ceph-mon[80026]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Dec  1 04:51:37 np0005540826 ceph-mon[80026]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Dec  1 04:51:37 np0005540826 ceph-mon[80026]: Deploying daemon keepalived.nfs.cephfs.compute-1.wzwqmm on compute-1
Dec  1 04:51:37 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 8.d scrub starts
Dec  1 04:51:37 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 8.d scrub ok
Dec  1 04:51:38 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": ".nfs", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec  1 04:51:38 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec  1 04:51:38 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec  1 04:51:38 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "2"}]: dispatch
Dec  1 04:51:38 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 8.8 scrub starts
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 8.8 scrub ok
Dec  1 04:51:38 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e66 e66: 3 total, 3 up, 3 in
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[8.14( v 57'44 (0'0,57'44] local-lis/les=59/61 n=0 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66 pruub=11.767738342s) [1] r=-1 lpr=66 pi=[59,66)/1 crt=57'44 lcod 0'0 mlcod 0'0 active pruub 203.691589355s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[8.14( v 57'44 (0'0,57'44] local-lis/les=59/61 n=0 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66 pruub=11.767671585s) [1] r=-1 lpr=66 pi=[59,66)/1 crt=57'44 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 203.691589355s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[11.17( v 53'48 (0'0,53'48] local-lis/les=63/64 n=0 ec=63/52 lis/c=63/63 les/c/f=64/64/0 sis=66 pruub=15.837435722s) [2] r=-1 lpr=66 pi=[63,66)/1 crt=53'48 lcod 0'0 mlcod 0'0 active pruub 207.761795044s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[11.16( v 53'48 (0'0,53'48] local-lis/les=63/64 n=0 ec=63/52 lis/c=63/63 les/c/f=64/64/0 sis=66 pruub=15.837241173s) [2] r=-1 lpr=66 pi=[63,66)/1 crt=53'48 lcod 0'0 mlcod 0'0 active pruub 207.761718750s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[9.15( v 49'6 (0'0,49'6] local-lis/les=61/62 n=0 ec=61/48 lis/c=61/61 les/c/f=62/62/0 sis=66 pruub=13.176409721s) [1] r=-1 lpr=66 pi=[61,66)/1 crt=49'6 lcod 0'0 mlcod 0'0 active pruub 205.100875854s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[11.16( v 53'48 (0'0,53'48] local-lis/les=63/64 n=0 ec=63/52 lis/c=63/63 les/c/f=64/64/0 sis=66 pruub=15.837223053s) [2] r=-1 lpr=66 pi=[63,66)/1 crt=53'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 207.761718750s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[9.15( v 49'6 (0'0,49'6] local-lis/les=61/62 n=0 ec=61/48 lis/c=61/61 les/c/f=62/62/0 sis=66 pruub=13.176373482s) [1] r=-1 lpr=66 pi=[61,66)/1 crt=49'6 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 205.100875854s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[8.15( v 57'44 (0'0,57'44] local-lis/les=59/61 n=0 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66 pruub=11.767067909s) [2] r=-1 lpr=66 pi=[59,66)/1 crt=57'44 lcod 0'0 mlcod 0'0 active pruub 203.691711426s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[8.15( v 57'44 (0'0,57'44] local-lis/les=59/61 n=0 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66 pruub=11.767042160s) [2] r=-1 lpr=66 pi=[59,66)/1 crt=57'44 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 203.691711426s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[8.16( v 57'44 (0'0,57'44] local-lis/les=59/61 n=2 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66 pruub=11.761515617s) [2] r=-1 lpr=66 pi=[59,66)/1 crt=57'44 lcod 0'0 mlcod 0'0 active pruub 203.686233521s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[11.14( v 64'51 (0'0,64'51] local-lis/les=63/64 n=0 ec=63/52 lis/c=63/63 les/c/f=64/64/0 sis=66 pruub=15.845439911s) [1] r=-1 lpr=66 pi=[63,66)/1 crt=64'49 lcod 64'50 mlcod 64'50 active pruub 207.770187378s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[8.16( v 57'44 (0'0,57'44] local-lis/les=59/61 n=2 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66 pruub=11.761493683s) [2] r=-1 lpr=66 pi=[59,66)/1 crt=57'44 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 203.686233521s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[9.16( v 49'6 (0'0,49'6] local-lis/les=61/62 n=0 ec=61/48 lis/c=61/61 les/c/f=62/62/0 sis=66 pruub=13.176046371s) [2] r=-1 lpr=66 pi=[61,66)/1 crt=49'6 lcod 0'0 mlcod 0'0 active pruub 205.100830078s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[11.14( v 64'51 (0'0,64'51] local-lis/les=63/64 n=0 ec=63/52 lis/c=63/63 les/c/f=64/64/0 sis=66 pruub=15.845408440s) [1] r=-1 lpr=66 pi=[63,66)/1 crt=64'49 lcod 64'50 mlcod 0'0 unknown NOTIFY pruub 207.770187378s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[9.16( v 49'6 (0'0,49'6] local-lis/les=61/62 n=0 ec=61/48 lis/c=61/61 les/c/f=62/62/0 sis=66 pruub=13.176029205s) [2] r=-1 lpr=66 pi=[61,66)/1 crt=49'6 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 205.100830078s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[8.17( v 57'44 (0'0,57'44] local-lis/les=59/61 n=0 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66 pruub=11.766711235s) [1] r=-1 lpr=66 pi=[59,66)/1 crt=57'44 lcod 0'0 mlcod 0'0 active pruub 203.691619873s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[8.17( v 57'44 (0'0,57'44] local-lis/les=59/61 n=0 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66 pruub=11.766699791s) [1] r=-1 lpr=66 pi=[59,66)/1 crt=57'44 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 203.691619873s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[11.13( v 53'48 (0'0,53'48] local-lis/les=63/64 n=0 ec=63/52 lis/c=63/63 les/c/f=64/64/0 sis=66 pruub=15.845622063s) [2] r=-1 lpr=66 pi=[63,66)/1 crt=53'48 lcod 0'0 mlcod 0'0 active pruub 207.770690918s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[8.10( v 65'47 (0'0,65'47] local-lis/les=59/61 n=1 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66 pruub=11.766510010s) [1] r=-1 lpr=66 pi=[59,66)/1 crt=65'47 lcod 61'46 mlcod 61'46 active pruub 203.691619873s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[9.11( v 49'6 (0'0,49'6] local-lis/les=61/62 n=0 ec=61/48 lis/c=61/61 les/c/f=62/62/0 sis=66 pruub=13.175782204s) [1] r=-1 lpr=66 pi=[61,66)/1 crt=49'6 lcod 0'0 mlcod 0'0 active pruub 205.100891113s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[11.13( v 53'48 (0'0,53'48] local-lis/les=63/64 n=0 ec=63/52 lis/c=63/63 les/c/f=64/64/0 sis=66 pruub=15.845588684s) [2] r=-1 lpr=66 pi=[63,66)/1 crt=53'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 207.770690918s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[9.11( v 49'6 (0'0,49'6] local-lis/les=61/62 n=0 ec=61/48 lis/c=61/61 les/c/f=62/62/0 sis=66 pruub=13.175764084s) [1] r=-1 lpr=66 pi=[61,66)/1 crt=49'6 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 205.100891113s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[8.10( v 65'47 (0'0,65'47] local-lis/les=59/61 n=1 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66 pruub=11.766473770s) [1] r=-1 lpr=66 pi=[59,66)/1 crt=65'47 lcod 61'46 mlcod 0'0 unknown NOTIFY pruub 203.691619873s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[11.12( v 53'48 (0'0,53'48] local-lis/les=63/64 n=0 ec=63/52 lis/c=63/63 les/c/f=64/64/0 sis=66 pruub=15.844991684s) [1] r=-1 lpr=66 pi=[63,66)/1 crt=53'48 lcod 0'0 mlcod 0'0 active pruub 207.770217896s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[11.12( v 53'48 (0'0,53'48] local-lis/les=63/64 n=0 ec=63/52 lis/c=63/63 les/c/f=64/64/0 sis=66 pruub=15.844978333s) [1] r=-1 lpr=66 pi=[63,66)/1 crt=53'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 207.770217896s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[9.17( v 49'6 (0'0,49'6] local-lis/les=61/62 n=0 ec=61/48 lis/c=61/61 les/c/f=62/62/0 sis=66 pruub=13.175573349s) [2] r=-1 lpr=66 pi=[61,66)/1 crt=49'6 lcod 0'0 mlcod 0'0 active pruub 205.100860596s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[9.17( v 49'6 (0'0,49'6] local-lis/les=61/62 n=0 ec=61/48 lis/c=61/61 les/c/f=62/62/0 sis=66 pruub=13.175559044s) [2] r=-1 lpr=66 pi=[61,66)/1 crt=49'6 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 205.100860596s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[8.11( v 57'44 (0'0,57'44] local-lis/les=59/61 n=0 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66 pruub=11.766350746s) [2] r=-1 lpr=66 pi=[59,66)/1 crt=57'44 lcod 0'0 mlcod 0'0 active pruub 203.691665649s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[8.11( v 57'44 (0'0,57'44] local-lis/les=59/61 n=0 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66 pruub=11.766336441s) [2] r=-1 lpr=66 pi=[59,66)/1 crt=57'44 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 203.691665649s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[9.10( v 49'6 (0'0,49'6] local-lis/les=61/62 n=0 ec=61/48 lis/c=61/61 les/c/f=62/62/0 sis=66 pruub=13.175664902s) [1] r=-1 lpr=66 pi=[61,66)/1 crt=49'6 lcod 0'0 mlcod 0'0 active pruub 205.101043701s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[11.1( v 53'48 (0'0,53'48] local-lis/les=63/64 n=1 ec=63/52 lis/c=63/63 les/c/f=64/64/0 sis=66 pruub=15.844926834s) [1] r=-1 lpr=66 pi=[63,66)/1 crt=53'48 lcod 0'0 mlcod 0'0 active pruub 207.770324707s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[9.3( v 49'6 (0'0,49'6] local-lis/les=61/62 n=1 ec=61/48 lis/c=61/61 les/c/f=62/62/0 sis=66 pruub=13.175504684s) [2] r=-1 lpr=66 pi=[61,66)/1 crt=49'6 lcod 0'0 mlcod 0'0 active pruub 205.100906372s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[9.10( v 49'6 (0'0,49'6] local-lis/les=61/62 n=0 ec=61/48 lis/c=61/61 les/c/f=62/62/0 sis=66 pruub=13.175645828s) [1] r=-1 lpr=66 pi=[61,66)/1 crt=49'6 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 205.101043701s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[9.3( v 49'6 (0'0,49'6] local-lis/les=61/62 n=1 ec=61/48 lis/c=61/61 les/c/f=62/62/0 sis=66 pruub=13.175494194s) [2] r=-1 lpr=66 pi=[61,66)/1 crt=49'6 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 205.100906372s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[8.2( v 57'44 (0'0,57'44] local-lis/les=59/61 n=0 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66 pruub=11.766252518s) [2] r=-1 lpr=66 pi=[59,66)/1 crt=57'44 lcod 0'0 mlcod 0'0 active pruub 203.691680908s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[11.1( v 53'48 (0'0,53'48] local-lis/les=63/64 n=1 ec=63/52 lis/c=63/63 les/c/f=64/64/0 sis=66 pruub=15.844901085s) [1] r=-1 lpr=66 pi=[63,66)/1 crt=53'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 207.770324707s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[8.2( v 57'44 (0'0,57'44] local-lis/les=59/61 n=0 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66 pruub=11.766239166s) [2] r=-1 lpr=66 pi=[59,66)/1 crt=57'44 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 203.691680908s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[8.3( v 57'44 (0'0,57'44] local-lis/les=59/61 n=0 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66 pruub=11.766174316s) [2] r=-1 lpr=66 pi=[59,66)/1 crt=57'44 lcod 0'0 mlcod 0'0 active pruub 203.691680908s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[8.3( v 57'44 (0'0,57'44] local-lis/les=59/61 n=0 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66 pruub=11.766163826s) [2] r=-1 lpr=66 pi=[59,66)/1 crt=57'44 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 203.691680908s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[8.f( v 57'44 (0'0,57'44] local-lis/les=59/61 n=0 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66 pruub=11.766097069s) [2] r=-1 lpr=66 pi=[59,66)/1 crt=57'44 lcod 0'0 mlcod 0'0 active pruub 203.691711426s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[8.f( v 57'44 (0'0,57'44] local-lis/les=59/61 n=0 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66 pruub=11.766083717s) [2] r=-1 lpr=66 pi=[59,66)/1 crt=57'44 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 203.691711426s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[9.e( v 49'6 (0'0,49'6] local-lis/les=61/62 n=0 ec=61/48 lis/c=61/61 les/c/f=62/62/0 sis=66 pruub=13.175421715s) [1] r=-1 lpr=66 pi=[61,66)/1 crt=49'6 lcod 0'0 mlcod 0'0 active pruub 205.101058960s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[8.8( v 57'44 (0'0,57'44] local-lis/les=59/61 n=0 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66 pruub=11.766167641s) [1] r=-1 lpr=66 pi=[59,66)/1 crt=57'44 lcod 0'0 mlcod 0'0 active pruub 203.691802979s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[9.9( v 49'6 (0'0,49'6] local-lis/les=61/62 n=0 ec=61/48 lis/c=61/61 les/c/f=62/62/0 sis=66 pruub=13.175542831s) [2] r=-1 lpr=66 pi=[61,66)/1 crt=49'6 lcod 0'0 mlcod 0'0 active pruub 205.101196289s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[9.e( v 49'6 (0'0,49'6] local-lis/les=61/62 n=0 ec=61/48 lis/c=61/61 les/c/f=62/62/0 sis=66 pruub=13.175402641s) [1] r=-1 lpr=66 pi=[61,66)/1 crt=49'6 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 205.101058960s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[9.9( v 49'6 (0'0,49'6] local-lis/les=61/62 n=0 ec=61/48 lis/c=61/61 les/c/f=62/62/0 sis=66 pruub=13.175528526s) [2] r=-1 lpr=66 pi=[61,66)/1 crt=49'6 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 205.101196289s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[8.8( v 57'44 (0'0,57'44] local-lis/les=59/61 n=0 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66 pruub=11.766154289s) [1] r=-1 lpr=66 pi=[59,66)/1 crt=57'44 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 203.691802979s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[11.a( v 53'48 (0'0,53'48] local-lis/les=63/64 n=0 ec=63/52 lis/c=63/63 les/c/f=64/64/0 sis=66 pruub=15.844581604s) [2] r=-1 lpr=66 pi=[63,66)/1 crt=53'48 lcod 0'0 mlcod 0'0 active pruub 207.770324707s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[9.8( v 49'6 (0'0,49'6] local-lis/les=61/62 n=0 ec=61/48 lis/c=61/61 les/c/f=62/62/0 sis=66 pruub=13.175395012s) [2] r=-1 lpr=66 pi=[61,66)/1 crt=49'6 lcod 0'0 mlcod 0'0 active pruub 205.101150513s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[11.a( v 53'48 (0'0,53'48] local-lis/les=63/64 n=0 ec=63/52 lis/c=63/63 les/c/f=64/64/0 sis=66 pruub=15.844571114s) [2] r=-1 lpr=66 pi=[63,66)/1 crt=53'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 207.770324707s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[8.9( v 57'44 (0'0,57'44] local-lis/les=59/61 n=0 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66 pruub=11.766045570s) [2] r=-1 lpr=66 pi=[59,66)/1 crt=57'44 lcod 0'0 mlcod 0'0 active pruub 203.691818237s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[8.9( v 57'44 (0'0,57'44] local-lis/les=59/61 n=0 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66 pruub=11.766030312s) [2] r=-1 lpr=66 pi=[59,66)/1 crt=57'44 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 203.691818237s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[9.8( v 49'6 (0'0,49'6] local-lis/les=61/62 n=0 ec=61/48 lis/c=61/61 les/c/f=62/62/0 sis=66 pruub=13.175385475s) [2] r=-1 lpr=66 pi=[61,66)/1 crt=49'6 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 205.101150513s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[9.b( v 49'6 (0'0,49'6] local-lis/les=61/62 n=0 ec=61/48 lis/c=61/61 les/c/f=62/62/0 sis=66 pruub=13.175131798s) [2] r=-1 lpr=66 pi=[61,66)/1 crt=49'6 lcod 0'0 mlcod 0'0 active pruub 205.101058960s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[8.a( v 65'45 (0'0,65'45] local-lis/les=59/61 n=1 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66 pruub=11.765862465s) [2] r=-1 lpr=66 pi=[59,66)/1 crt=65'45 lcod 57'44 mlcod 57'44 active pruub 203.691802979s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[9.b( v 49'6 (0'0,49'6] local-lis/les=61/62 n=0 ec=61/48 lis/c=61/61 les/c/f=62/62/0 sis=66 pruub=13.175112724s) [2] r=-1 lpr=66 pi=[61,66)/1 crt=49'6 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 205.101058960s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[9.f( v 49'6 (0'0,49'6] local-lis/les=61/62 n=0 ec=61/48 lis/c=61/61 les/c/f=62/62/0 sis=66 pruub=13.175112724s) [1] r=-1 lpr=66 pi=[61,66)/1 crt=49'6 lcod 0'0 mlcod 0'0 active pruub 205.101089478s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[8.a( v 65'45 (0'0,65'45] local-lis/les=59/61 n=1 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66 pruub=11.765832901s) [2] r=-1 lpr=66 pi=[59,66)/1 crt=65'45 lcod 57'44 mlcod 0'0 unknown NOTIFY pruub 203.691802979s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[9.f( v 49'6 (0'0,49'6] local-lis/les=61/62 n=0 ec=61/48 lis/c=61/61 les/c/f=62/62/0 sis=66 pruub=13.175097466s) [1] r=-1 lpr=66 pi=[61,66)/1 crt=49'6 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 205.101089478s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[11.e( v 64'51 (0'0,64'51] local-lis/les=63/64 n=0 ec=63/52 lis/c=63/63 les/c/f=64/64/0 sis=66 pruub=15.844586372s) [2] r=-1 lpr=66 pi=[63,66)/1 crt=64'49 lcod 64'50 mlcod 64'50 active pruub 207.770614624s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[11.e( v 64'51 (0'0,64'51] local-lis/les=63/64 n=0 ec=63/52 lis/c=63/63 les/c/f=64/64/0 sis=66 pruub=15.844564438s) [2] r=-1 lpr=66 pi=[63,66)/1 crt=64'49 lcod 64'50 mlcod 0'0 unknown NOTIFY pruub 207.770614624s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[8.d( v 57'44 (0'0,57'44] local-lis/les=59/61 n=0 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66 pruub=11.765670776s) [2] r=-1 lpr=66 pi=[59,66)/1 crt=57'44 lcod 0'0 mlcod 0'0 active pruub 203.691741943s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[8.d( v 57'44 (0'0,57'44] local-lis/les=59/61 n=0 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66 pruub=11.765658379s) [2] r=-1 lpr=66 pi=[59,66)/1 crt=57'44 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 203.691741943s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[11.f( v 53'48 (0'0,53'48] local-lis/les=63/64 n=0 ec=63/52 lis/c=63/63 les/c/f=64/64/0 sis=66 pruub=15.844589233s) [1] r=-1 lpr=66 pi=[63,66)/1 crt=53'48 lcod 0'0 mlcod 0'0 active pruub 207.770721436s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[9.d( v 49'6 (0'0,49'6] local-lis/les=61/62 n=0 ec=61/48 lis/c=61/61 les/c/f=62/62/0 sis=66 pruub=13.175376892s) [1] r=-1 lpr=66 pi=[61,66)/1 crt=49'6 lcod 0'0 mlcod 0'0 active pruub 205.101516724s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[11.f( v 53'48 (0'0,53'48] local-lis/les=63/64 n=0 ec=63/52 lis/c=63/63 les/c/f=64/64/0 sis=66 pruub=15.844576836s) [1] r=-1 lpr=66 pi=[63,66)/1 crt=53'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 207.770721436s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[8.c( v 57'44 (0'0,57'44] local-lis/les=59/61 n=0 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66 pruub=11.765828133s) [2] r=-1 lpr=66 pi=[59,66)/1 crt=57'44 lcod 0'0 mlcod 0'0 active pruub 203.692001343s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[9.d( v 49'6 (0'0,49'6] local-lis/les=61/62 n=0 ec=61/48 lis/c=61/61 les/c/f=62/62/0 sis=66 pruub=13.175366402s) [1] r=-1 lpr=66 pi=[61,66)/1 crt=49'6 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 205.101516724s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[8.c( v 57'44 (0'0,57'44] local-lis/les=59/61 n=0 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66 pruub=11.765817642s) [2] r=-1 lpr=66 pi=[59,66)/1 crt=57'44 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 203.692001343s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[11.8( v 53'48 (0'0,53'48] local-lis/les=63/64 n=1 ec=63/52 lis/c=63/63 les/c/f=64/64/0 sis=66 pruub=15.844463348s) [2] r=-1 lpr=66 pi=[63,66)/1 crt=53'48 lcod 0'0 mlcod 0'0 active pruub 207.770690918s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[8.b( v 57'44 (0'0,57'44] local-lis/les=59/61 n=0 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66 pruub=11.765686989s) [2] r=-1 lpr=66 pi=[59,66)/1 crt=57'44 lcod 0'0 mlcod 0'0 active pruub 203.691940308s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[11.8( v 53'48 (0'0,53'48] local-lis/les=63/64 n=1 ec=63/52 lis/c=63/63 les/c/f=64/64/0 sis=66 pruub=15.844454765s) [2] r=-1 lpr=66 pi=[63,66)/1 crt=53'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 207.770690918s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[9.a( v 49'6 (0'0,49'6] local-lis/les=61/62 n=0 ec=61/48 lis/c=61/61 les/c/f=62/62/0 sis=66 pruub=13.175073624s) [1] r=-1 lpr=66 pi=[61,66)/1 crt=49'6 lcod 0'0 mlcod 0'0 active pruub 205.101333618s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[8.b( v 57'44 (0'0,57'44] local-lis/les=59/61 n=0 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66 pruub=11.765676498s) [2] r=-1 lpr=66 pi=[59,66)/1 crt=57'44 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 203.691940308s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[9.a( v 49'6 (0'0,49'6] local-lis/les=61/62 n=0 ec=61/48 lis/c=61/61 les/c/f=62/62/0 sis=66 pruub=13.175054550s) [1] r=-1 lpr=66 pi=[61,66)/1 crt=49'6 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 205.101333618s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[11.3( v 64'51 (0'0,64'51] local-lis/les=63/64 n=1 ec=63/52 lis/c=63/63 les/c/f=64/64/0 sis=66 pruub=15.844451904s) [2] r=-1 lpr=66 pi=[63,66)/1 crt=64'49 lcod 64'50 mlcod 64'50 active pruub 207.770843506s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[11.3( v 64'51 (0'0,64'51] local-lis/les=63/64 n=1 ec=63/52 lis/c=63/63 les/c/f=64/64/0 sis=66 pruub=15.844432831s) [2] r=-1 lpr=66 pi=[63,66)/1 crt=64'49 lcod 64'50 mlcod 0'0 unknown NOTIFY pruub 207.770843506s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[11.17( v 53'48 (0'0,53'48] local-lis/les=63/64 n=0 ec=63/52 lis/c=63/63 les/c/f=64/64/0 sis=66 pruub=15.835384369s) [2] r=-1 lpr=66 pi=[63,66)/1 crt=53'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 207.761795044s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[9.6( v 49'6 (0'0,49'6] local-lis/les=61/62 n=1 ec=61/48 lis/c=61/61 les/c/f=62/62/0 sis=66 pruub=13.175096512s) [1] r=-1 lpr=66 pi=[61,66)/1 crt=49'6 lcod 0'0 mlcod 0'0 active pruub 205.101593018s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[11.4( v 53'48 (0'0,53'48] local-lis/les=63/64 n=1 ec=63/52 lis/c=63/63 les/c/f=64/64/0 sis=66 pruub=15.844283104s) [1] r=-1 lpr=66 pi=[63,66)/1 crt=53'48 lcod 0'0 mlcod 0'0 active pruub 207.770782471s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[9.6( v 49'6 (0'0,49'6] local-lis/les=61/62 n=1 ec=61/48 lis/c=61/61 les/c/f=62/62/0 sis=66 pruub=13.175085068s) [1] r=-1 lpr=66 pi=[61,66)/1 crt=49'6 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 205.101593018s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[11.4( v 53'48 (0'0,53'48] local-lis/les=63/64 n=1 ec=63/52 lis/c=63/63 les/c/f=64/64/0 sis=66 pruub=15.844264984s) [1] r=-1 lpr=66 pi=[63,66)/1 crt=53'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 207.770782471s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[11.5( v 53'48 (0'0,53'48] local-lis/les=63/64 n=1 ec=63/52 lis/c=63/63 les/c/f=64/64/0 sis=66 pruub=15.844220161s) [1] r=-1 lpr=66 pi=[63,66)/1 crt=53'48 lcod 0'0 mlcod 0'0 active pruub 207.770797729s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[11.5( v 53'48 (0'0,53'48] local-lis/les=63/64 n=1 ec=63/52 lis/c=63/63 les/c/f=64/64/0 sis=66 pruub=15.844208717s) [1] r=-1 lpr=66 pi=[63,66)/1 crt=53'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 207.770797729s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[9.7( v 49'6 (0'0,49'6] local-lis/les=61/62 n=0 ec=61/48 lis/c=61/61 les/c/f=62/62/0 sis=66 pruub=13.174918175s) [2] r=-1 lpr=66 pi=[61,66)/1 crt=49'6 lcod 0'0 mlcod 0'0 active pruub 205.101531982s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[9.7( v 49'6 (0'0,49'6] local-lis/les=61/62 n=0 ec=61/48 lis/c=61/61 les/c/f=62/62/0 sis=66 pruub=13.174900055s) [2] r=-1 lpr=66 pi=[61,66)/1 crt=49'6 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 205.101531982s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[8.6( v 65'45 (0'0,65'45] local-lis/les=59/61 n=1 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66 pruub=11.765355110s) [2] r=-1 lpr=66 pi=[59,66)/1 crt=65'45 lcod 57'44 mlcod 57'44 active pruub 203.692031860s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[8.6( v 65'45 (0'0,65'45] local-lis/les=59/61 n=1 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66 pruub=11.765340805s) [2] r=-1 lpr=66 pi=[59,66)/1 crt=65'45 lcod 57'44 mlcod 0'0 unknown NOTIFY pruub 203.692031860s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[9.5( v 49'6 (0'0,49'6] local-lis/les=61/62 n=1 ec=61/48 lis/c=61/61 les/c/f=62/62/0 sis=66 pruub=13.174885750s) [2] r=-1 lpr=66 pi=[61,66)/1 crt=49'6 lcod 0'0 mlcod 0'0 active pruub 205.101684570s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[8.5( v 57'44 (0'0,57'44] local-lis/les=59/61 n=1 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66 pruub=11.765254021s) [2] r=-1 lpr=66 pi=[59,66)/1 crt=57'44 lcod 0'0 mlcod 0'0 active pruub 203.692062378s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[9.5( v 49'6 (0'0,49'6] local-lis/les=61/62 n=1 ec=61/48 lis/c=61/61 les/c/f=62/62/0 sis=66 pruub=13.174872398s) [2] r=-1 lpr=66 pi=[61,66)/1 crt=49'6 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 205.101684570s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[8.4( v 57'44 (0'0,57'44] local-lis/les=59/61 n=1 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66 pruub=11.765226364s) [1] r=-1 lpr=66 pi=[59,66)/1 crt=57'44 lcod 0'0 mlcod 0'0 active pruub 203.692047119s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[8.5( v 57'44 (0'0,57'44] local-lis/les=59/61 n=1 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66 pruub=11.765236855s) [2] r=-1 lpr=66 pi=[59,66)/1 crt=57'44 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 203.692062378s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[8.4( v 57'44 (0'0,57'44] local-lis/les=59/61 n=1 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66 pruub=11.765213013s) [1] r=-1 lpr=66 pi=[59,66)/1 crt=57'44 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 203.692047119s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[8.1b( v 57'44 (0'0,57'44] local-lis/les=59/61 n=0 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66 pruub=11.765105247s) [1] r=-1 lpr=66 pi=[59,66)/1 crt=57'44 lcod 0'0 mlcod 0'0 active pruub 203.692047119s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[8.1b( v 57'44 (0'0,57'44] local-lis/les=59/61 n=0 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66 pruub=11.765094757s) [1] r=-1 lpr=66 pi=[59,66)/1 crt=57'44 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 203.692047119s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[11.19( v 53'48 (0'0,53'48] local-lis/les=63/64 n=0 ec=63/52 lis/c=63/63 les/c/f=64/64/0 sis=66 pruub=15.843907356s) [2] r=-1 lpr=66 pi=[63,66)/1 crt=53'48 lcod 0'0 mlcod 0'0 active pruub 207.770950317s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[11.1a( v 53'48 (0'0,53'48] local-lis/les=63/64 n=0 ec=63/52 lis/c=63/63 les/c/f=64/64/0 sis=66 pruub=15.843977928s) [1] r=-1 lpr=66 pi=[63,66)/1 crt=53'48 lcod 0'0 mlcod 0'0 active pruub 207.771041870s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[9.18( v 49'6 (0'0,49'6] local-lis/les=61/62 n=0 ec=61/48 lis/c=61/61 les/c/f=62/62/0 sis=66 pruub=13.174774170s) [2] r=-1 lpr=66 pi=[61,66)/1 crt=49'6 lcod 0'0 mlcod 0'0 active pruub 205.101852417s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[11.19( v 53'48 (0'0,53'48] local-lis/les=63/64 n=0 ec=63/52 lis/c=63/63 les/c/f=64/64/0 sis=66 pruub=15.843878746s) [2] r=-1 lpr=66 pi=[63,66)/1 crt=53'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 207.770950317s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[11.1a( v 53'48 (0'0,53'48] local-lis/les=63/64 n=0 ec=63/52 lis/c=63/63 les/c/f=64/64/0 sis=66 pruub=15.843962669s) [1] r=-1 lpr=66 pi=[63,66)/1 crt=53'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 207.771041870s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[9.18( v 49'6 (0'0,49'6] local-lis/les=61/62 n=0 ec=61/48 lis/c=61/61 les/c/f=62/62/0 sis=66 pruub=13.174755096s) [2] r=-1 lpr=66 pi=[61,66)/1 crt=49'6 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 205.101852417s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[8.19( v 57'44 (0'0,57'44] local-lis/les=59/61 n=0 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66 pruub=11.765346527s) [1] r=-1 lpr=66 pi=[59,66)/1 crt=57'44 lcod 0'0 mlcod 0'0 active pruub 203.692474365s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[8.19( v 57'44 (0'0,57'44] local-lis/les=59/61 n=0 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66 pruub=11.765293121s) [1] r=-1 lpr=66 pi=[59,66)/1 crt=57'44 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 203.692474365s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[8.18( v 57'44 (0'0,57'44] local-lis/les=59/61 n=0 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66 pruub=11.765258789s) [1] r=-1 lpr=66 pi=[59,66)/1 crt=57'44 lcod 0'0 mlcod 0'0 active pruub 203.692474365s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[11.1b( v 53'48 (0'0,53'48] local-lis/les=63/64 n=0 ec=63/52 lis/c=63/63 les/c/f=64/64/0 sis=66 pruub=15.843996048s) [1] r=-1 lpr=66 pi=[63,66)/1 crt=53'48 lcod 0'0 mlcod 0'0 active pruub 207.771209717s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[8.18( v 57'44 (0'0,57'44] local-lis/les=59/61 n=0 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66 pruub=11.765249252s) [1] r=-1 lpr=66 pi=[59,66)/1 crt=57'44 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 203.692474365s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[11.1b( v 53'48 (0'0,53'48] local-lis/les=63/64 n=0 ec=63/52 lis/c=63/63 les/c/f=64/64/0 sis=66 pruub=15.843981743s) [1] r=-1 lpr=66 pi=[63,66)/1 crt=53'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 207.771209717s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[11.1c( v 53'48 (0'0,53'48] local-lis/les=63/64 n=0 ec=63/52 lis/c=63/63 les/c/f=64/64/0 sis=66 pruub=15.843798637s) [1] r=-1 lpr=66 pi=[63,66)/1 crt=53'48 lcod 0'0 mlcod 0'0 active pruub 207.771087646s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[11.1c( v 53'48 (0'0,53'48] local-lis/les=63/64 n=0 ec=63/52 lis/c=63/63 les/c/f=64/64/0 sis=66 pruub=15.843787193s) [1] r=-1 lpr=66 pi=[63,66)/1 crt=53'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 207.771087646s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[11.7( v 53'48 (0'0,53'48] local-lis/les=63/64 n=1 ec=63/52 lis/c=63/63 les/c/f=64/64/0 sis=66 pruub=15.843589783s) [1] r=-1 lpr=66 pi=[63,66)/1 crt=53'48 lcod 0'0 mlcod 0'0 active pruub 207.770950317s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[11.7( v 53'48 (0'0,53'48] local-lis/les=63/64 n=1 ec=63/52 lis/c=63/63 les/c/f=64/64/0 sis=66 pruub=15.843573570s) [1] r=-1 lpr=66 pi=[63,66)/1 crt=53'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 207.770950317s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[11.1d( v 53'48 (0'0,53'48] local-lis/les=63/64 n=0 ec=63/52 lis/c=63/63 les/c/f=64/64/0 sis=66 pruub=15.843728065s) [1] r=-1 lpr=66 pi=[63,66)/1 crt=53'48 lcod 0'0 mlcod 0'0 active pruub 207.771118164s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[11.1d( v 53'48 (0'0,53'48] local-lis/les=63/64 n=0 ec=63/52 lis/c=63/63 les/c/f=64/64/0 sis=66 pruub=15.843713760s) [1] r=-1 lpr=66 pi=[63,66)/1 crt=53'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 207.771118164s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[11.1e( v 53'48 (0'0,53'48] local-lis/les=63/64 n=0 ec=63/52 lis/c=63/63 les/c/f=64/64/0 sis=66 pruub=15.843672752s) [1] r=-1 lpr=66 pi=[63,66)/1 crt=53'48 lcod 0'0 mlcod 0'0 active pruub 207.771224976s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[8.1f( v 57'44 (0'0,57'44] local-lis/les=59/61 n=0 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66 pruub=11.764923096s) [2] r=-1 lpr=66 pi=[59,66)/1 crt=57'44 lcod 0'0 mlcod 0'0 active pruub 203.692474365s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[8.1f( v 57'44 (0'0,57'44] local-lis/les=59/61 n=0 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66 pruub=11.764910698s) [2] r=-1 lpr=66 pi=[59,66)/1 crt=57'44 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 203.692474365s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[11.1e( v 53'48 (0'0,53'48] local-lis/les=63/64 n=0 ec=63/52 lis/c=63/63 les/c/f=64/64/0 sis=66 pruub=15.843659401s) [1] r=-1 lpr=66 pi=[63,66)/1 crt=53'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 207.771224976s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[8.1c( v 57'44 (0'0,57'44] local-lis/les=59/61 n=0 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66 pruub=11.764876366s) [2] r=-1 lpr=66 pi=[59,66)/1 crt=57'44 lcod 0'0 mlcod 0'0 active pruub 203.692535400s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[8.1c( v 57'44 (0'0,57'44] local-lis/les=59/61 n=0 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66 pruub=11.764864922s) [2] r=-1 lpr=66 pi=[59,66)/1 crt=57'44 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 203.692535400s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[9.1d( v 49'6 (0'0,49'6] local-lis/les=61/62 n=0 ec=61/48 lis/c=61/61 les/c/f=62/62/0 sis=66 pruub=13.174166679s) [2] r=-1 lpr=66 pi=[61,66)/1 crt=49'6 lcod 0'0 mlcod 0'0 active pruub 205.101852417s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[9.12( v 49'6 (0'0,49'6] local-lis/les=61/62 n=0 ec=61/48 lis/c=61/61 les/c/f=62/62/0 sis=66 pruub=13.174220085s) [1] r=-1 lpr=66 pi=[61,66)/1 crt=49'6 lcod 0'0 mlcod 0'0 active pruub 205.101959229s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[9.12( v 49'6 (0'0,49'6] local-lis/les=61/62 n=0 ec=61/48 lis/c=61/61 les/c/f=62/62/0 sis=66 pruub=13.174207687s) [1] r=-1 lpr=66 pi=[61,66)/1 crt=49'6 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 205.101959229s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[9.1d( v 49'6 (0'0,49'6] local-lis/les=61/62 n=0 ec=61/48 lis/c=61/61 les/c/f=62/62/0 sis=66 pruub=13.174099922s) [2] r=-1 lpr=66 pi=[61,66)/1 crt=49'6 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 205.101852417s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[9.13( v 49'6 (0'0,49'6] local-lis/les=61/62 n=0 ec=61/48 lis/c=61/61 les/c/f=62/62/0 sis=66 pruub=13.174158096s) [2] r=-1 lpr=66 pi=[61,66)/1 crt=49'6 lcod 0'0 mlcod 0'0 active pruub 205.101959229s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[9.13( v 49'6 (0'0,49'6] local-lis/les=61/62 n=0 ec=61/48 lis/c=61/61 les/c/f=62/62/0 sis=66 pruub=13.174146652s) [2] r=-1 lpr=66 pi=[61,66)/1 crt=49'6 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 205.101959229s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[8.12( v 57'44 (0'0,57'44] local-lis/les=59/61 n=0 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66 pruub=11.764698982s) [1] r=-1 lpr=66 pi=[59,66)/1 crt=57'44 lcod 0'0 mlcod 0'0 active pruub 203.692626953s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[8.12( v 57'44 (0'0,57'44] local-lis/les=59/61 n=0 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66 pruub=11.764680862s) [1] r=-1 lpr=66 pi=[59,66)/1 crt=57'44 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 203.692626953s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[12.10( empty local-lis/les=0/0 n=0 ec=64/54 lis/c=64/64 les/c/f=65/65/0 sis=66) [0] r=0 lpr=66 pi=[64,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[12.12( empty local-lis/les=0/0 n=0 ec=64/54 lis/c=64/64 les/c/f=65/65/0 sis=66) [0] r=0 lpr=66 pi=[64,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[12.6( empty local-lis/les=0/0 n=0 ec=64/54 lis/c=64/64 les/c/f=65/65/0 sis=66) [0] r=0 lpr=66 pi=[64,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[12.8( empty local-lis/les=0/0 n=0 ec=64/54 lis/c=64/64 les/c/f=65/65/0 sis=66) [0] r=0 lpr=66 pi=[64,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[12.c( empty local-lis/les=0/0 n=0 ec=64/54 lis/c=64/64 les/c/f=65/65/0 sis=66) [0] r=0 lpr=66 pi=[64,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[12.b( empty local-lis/les=0/0 n=0 ec=64/54 lis/c=64/64 les/c/f=65/65/0 sis=66) [0] r=0 lpr=66 pi=[64,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[12.e( empty local-lis/les=0/0 n=0 ec=64/54 lis/c=64/64 les/c/f=65/65/0 sis=66) [0] r=0 lpr=66 pi=[64,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[12.a( empty local-lis/les=0/0 n=0 ec=64/54 lis/c=64/64 les/c/f=65/65/0 sis=66) [0] r=0 lpr=66 pi=[64,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[12.1c( empty local-lis/les=0/0 n=0 ec=64/54 lis/c=64/64 les/c/f=65/65/0 sis=66) [0] r=0 lpr=66 pi=[64,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:38 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 66 pg[12.19( empty local-lis/les=0/0 n=0 ec=64/54 lis/c=64/64 les/c/f=65/65/0 sis=66) [0] r=0 lpr=66 pi=[64,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:39 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 9.14 scrub starts
Dec  1 04:51:39 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 9.14 scrub ok
Dec  1 04:51:39 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e67 e67: 3 total, 3 up, 3 in
Dec  1 04:51:39 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e67 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:51:39 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 67 pg[12.19( empty local-lis/les=66/67 n=0 ec=64/54 lis/c=64/64 les/c/f=65/65/0 sis=66) [0] r=0 lpr=66 pi=[64,66)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:39 np0005540826 podman[85467]: 2025-12-01 09:51:39.674816909 +0000 UTC m=+4.101511882 image pull 4a3a1ff181d97c6dcfa9138ad76eb99fa2c1e840298461d5a7a56133bc05b9a2 quay.io/ceph/keepalived:2.2.4
Dec  1 04:51:39 np0005540826 podman[85467]: 2025-12-01 09:51:39.767792832 +0000 UTC m=+4.194487775 container create 652c88130138f75abc2d5b85acf3fba251cb323e92c0364b7b95ed4ececada0e (image=quay.io/ceph/keepalived:2.2.4, name=competent_shtern, vendor=Red Hat, Inc., build-date=2023-02-22T09:23:20, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Keepalived on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, description=keepalived for Ceph, vcs-type=git, summary=Provides keepalived on RHEL 9 for Ceph., io.buildah.version=1.28.2, name=keepalived, io.openshift.expose-services=, com.redhat.component=keepalived-container, version=2.2.4, architecture=x86_64, distribution-scope=public, io.openshift.tags=Ceph keepalived, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793)
Dec  1 04:51:39 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 67 pg[12.1c( empty local-lis/les=66/67 n=0 ec=64/54 lis/c=64/64 les/c/f=65/65/0 sis=66) [0] r=0 lpr=66 pi=[64,66)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:39 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 67 pg[12.a( empty local-lis/les=66/67 n=0 ec=64/54 lis/c=64/64 les/c/f=65/65/0 sis=66) [0] r=0 lpr=66 pi=[64,66)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:39 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 67 pg[12.e( empty local-lis/les=66/67 n=0 ec=64/54 lis/c=64/64 les/c/f=65/65/0 sis=66) [0] r=0 lpr=66 pi=[64,66)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:39 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 67 pg[12.8( empty local-lis/les=66/67 n=0 ec=64/54 lis/c=64/64 les/c/f=65/65/0 sis=66) [0] r=0 lpr=66 pi=[64,66)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:39 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 67 pg[12.c( empty local-lis/les=66/67 n=0 ec=64/54 lis/c=64/64 les/c/f=65/65/0 sis=66) [0] r=0 lpr=66 pi=[64,66)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:39 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 67 pg[12.b( empty local-lis/les=66/67 n=0 ec=64/54 lis/c=64/64 les/c/f=65/65/0 sis=66) [0] r=0 lpr=66 pi=[64,66)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:39 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 67 pg[12.6( empty local-lis/les=66/67 n=0 ec=64/54 lis/c=64/64 les/c/f=65/65/0 sis=66) [0] r=0 lpr=66 pi=[64,66)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:39 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 67 pg[12.10( empty local-lis/les=66/67 n=0 ec=64/54 lis/c=64/64 les/c/f=65/65/0 sis=66) [0] r=0 lpr=66 pi=[64,66)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:39 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 67 pg[12.12( empty local-lis/les=66/67 n=0 ec=64/54 lis/c=64/64 les/c/f=65/65/0 sis=66) [0] r=0 lpr=66 pi=[64,66)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:39 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": ".nfs", "var": "pgp_num_actual", "val": "32"}]': finished
Dec  1 04:51:39 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pgp_num_actual", "val": "32"}]': finished
Dec  1 04:51:39 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pgp_num_actual", "val": "32"}]': finished
Dec  1 04:51:39 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "2"}]': finished
Dec  1 04:51:39 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pgp_num_actual", "val": "32"}]': finished
Dec  1 04:51:39 np0005540826 systemd[1]: Started libpod-conmon-652c88130138f75abc2d5b85acf3fba251cb323e92c0364b7b95ed4ececada0e.scope.
Dec  1 04:51:39 np0005540826 systemd[1]: Started libcrun container.
Dec  1 04:51:40 np0005540826 podman[85467]: 2025-12-01 09:51:40.080528972 +0000 UTC m=+4.507223925 container init 652c88130138f75abc2d5b85acf3fba251cb323e92c0364b7b95ed4ececada0e (image=quay.io/ceph/keepalived:2.2.4, name=competent_shtern, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2023-02-22T09:23:20, description=keepalived for Ceph, vcs-type=git, name=keepalived, release=1793, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=keepalived-container, io.buildah.version=1.28.2, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, io.k8s.display-name=Keepalived on RHEL 9, version=2.2.4, architecture=x86_64, io.openshift.tags=Ceph keepalived, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vendor=Red Hat, Inc., summary=Provides keepalived on RHEL 9 for Ceph.)
Dec  1 04:51:40 np0005540826 podman[85467]: 2025-12-01 09:51:40.087662195 +0000 UTC m=+4.514357138 container start 652c88130138f75abc2d5b85acf3fba251cb323e92c0364b7b95ed4ececada0e (image=quay.io/ceph/keepalived:2.2.4, name=competent_shtern, architecture=x86_64, release=1793, com.redhat.component=keepalived-container, summary=Provides keepalived on RHEL 9 for Ceph., io.buildah.version=1.28.2, distribution-scope=public, name=keepalived, io.openshift.tags=Ceph keepalived, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., description=keepalived for Ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2023-02-22T09:23:20, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.k8s.display-name=Keepalived on RHEL 9, version=2.2.4)
Dec  1 04:51:40 np0005540826 competent_shtern[85562]: 0 0
Dec  1 04:51:40 np0005540826 systemd[1]: libpod-652c88130138f75abc2d5b85acf3fba251cb323e92c0364b7b95ed4ececada0e.scope: Deactivated successfully.
Dec  1 04:51:40 np0005540826 podman[85467]: 2025-12-01 09:51:40.133582231 +0000 UTC m=+4.560277204 container attach 652c88130138f75abc2d5b85acf3fba251cb323e92c0364b7b95ed4ececada0e (image=quay.io/ceph/keepalived:2.2.4, name=competent_shtern, name=keepalived, io.openshift.tags=Ceph keepalived, release=1793, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Keepalived on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, version=2.2.4, architecture=x86_64, build-date=2023-02-22T09:23:20, summary=Provides keepalived on RHEL 9 for Ceph., io.buildah.version=1.28.2, distribution-scope=public, io.openshift.expose-services=, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vendor=Red Hat, Inc., description=keepalived for Ceph, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.component=keepalived-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Dec  1 04:51:40 np0005540826 podman[85467]: 2025-12-01 09:51:40.134580597 +0000 UTC m=+4.561275540 container died 652c88130138f75abc2d5b85acf3fba251cb323e92c0364b7b95ed4ececada0e (image=quay.io/ceph/keepalived:2.2.4, name=competent_shtern, io.openshift.expose-services=, release=1793, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=2.2.4, io.buildah.version=1.28.2, vendor=Red Hat, Inc., description=keepalived for Ceph, com.redhat.component=keepalived-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.k8s.display-name=Keepalived on RHEL 9, architecture=x86_64, distribution-scope=public, name=keepalived, io.openshift.tags=Ceph keepalived, build-date=2023-02-22T09:23:20, vcs-type=git, summary=Provides keepalived on RHEL 9 for Ceph., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793)
Dec  1 04:51:40 np0005540826 systemd[1]: var-lib-containers-storage-overlay-e422d12e22e6aab351136333f8151360a71487005f8bc54b03ed204242317b50-merged.mount: Deactivated successfully.
Dec  1 04:51:40 np0005540826 podman[85467]: 2025-12-01 09:51:40.61757938 +0000 UTC m=+5.044274323 container remove 652c88130138f75abc2d5b85acf3fba251cb323e92c0364b7b95ed4ececada0e (image=quay.io/ceph/keepalived:2.2.4, name=competent_shtern, build-date=2023-02-22T09:23:20, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, com.redhat.component=keepalived-container, summary=Provides keepalived on RHEL 9 for Ceph., version=2.2.4, io.openshift.tags=Ceph keepalived, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.buildah.version=1.28.2, architecture=x86_64, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Keepalived on RHEL 9, vcs-type=git, io.openshift.expose-services=, description=keepalived for Ceph, vendor=Red Hat, Inc., name=keepalived, release=1793)
Dec  1 04:51:40 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 11.15 scrub starts
Dec  1 04:51:40 np0005540826 systemd[1]: libpod-conmon-652c88130138f75abc2d5b85acf3fba251cb323e92c0364b7b95ed4ececada0e.scope: Deactivated successfully.
Dec  1 04:51:40 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 11.15 scrub ok
Dec  1 04:51:40 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e68 e68: 3 total, 3 up, 3 in
Dec  1 04:51:40 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 68 pg[10.1e( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=68) [0] r=0 lpr=68 pi=[61,68)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:40 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 68 pg[10.1a( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=68) [0] r=0 lpr=68 pi=[61,68)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:40 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 68 pg[10.6( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=68) [0] r=0 lpr=68 pi=[61,68)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:40 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 68 pg[10.2( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=68) [0] r=0 lpr=68 pi=[61,68)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:40 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 68 pg[10.e( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=68) [0] r=0 lpr=68 pi=[61,68)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:40 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 68 pg[10.a( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=68) [0] r=0 lpr=68 pi=[61,68)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:40 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 68 pg[10.12( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=68) [0] r=0 lpr=68 pi=[61,68)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:40 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 68 pg[10.16( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=68) [0] r=0 lpr=68 pi=[61,68)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:40 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:51:40 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  1 04:51:40 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:51:40 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 04:51:40 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "3"}]: dispatch
Dec  1 04:51:41 np0005540826 systemd[1]: Reloading.
Dec  1 04:51:41 np0005540826 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:51:41 np0005540826 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:51:41 np0005540826 systemd[1]: Reloading.
Dec  1 04:51:41 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 9.2 scrub starts
Dec  1 04:51:41 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 9.2 scrub ok
Dec  1 04:51:41 np0005540826 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:51:41 np0005540826 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:51:41 np0005540826 systemd[1]: Starting Ceph keepalived.nfs.cephfs.compute-1.wzwqmm for 365f19c2-81e5-5edd-b6b4-280555214d3a...
Dec  1 04:51:41 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e69 e69: 3 total, 3 up, 3 in
Dec  1 04:51:41 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 69 pg[10.1a( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=69) [0]/[1] r=-1 lpr=69 pi=[61,69)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:41 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 69 pg[10.1a( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=69) [0]/[1] r=-1 lpr=69 pi=[61,69)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:41 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 69 pg[10.2( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=69) [0]/[1] r=-1 lpr=69 pi=[61,69)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:41 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 69 pg[10.2( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=69) [0]/[1] r=-1 lpr=69 pi=[61,69)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:41 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 69 pg[10.e( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=69) [0]/[1] r=-1 lpr=69 pi=[61,69)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:41 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 69 pg[10.e( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=69) [0]/[1] r=-1 lpr=69 pi=[61,69)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:41 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 69 pg[10.12( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=69) [0]/[1] r=-1 lpr=69 pi=[61,69)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:41 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 69 pg[10.12( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=69) [0]/[1] r=-1 lpr=69 pi=[61,69)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:41 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 69 pg[10.1e( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=69) [0]/[1] r=-1 lpr=69 pi=[61,69)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:41 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 69 pg[10.16( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=69) [0]/[1] r=-1 lpr=69 pi=[61,69)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:41 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 69 pg[10.16( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=69) [0]/[1] r=-1 lpr=69 pi=[61,69)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:41 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 69 pg[10.a( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=69) [0]/[1] r=-1 lpr=69 pi=[61,69)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:41 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 69 pg[10.1e( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=69) [0]/[1] r=-1 lpr=69 pi=[61,69)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:41 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 69 pg[10.a( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=69) [0]/[1] r=-1 lpr=69 pi=[61,69)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:41 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 69 pg[10.6( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=69) [0]/[1] r=-1 lpr=69 pi=[61,69)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:41 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 69 pg[10.6( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=69) [0]/[1] r=-1 lpr=69 pi=[61,69)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:41 np0005540826 podman[85709]: 2025-12-01 09:51:41.90924605 +0000 UTC m=+0.040365935 container create b99b1792fbad48bc30fedd676b925d405af73aba1041bee68b375eadfb2319cf (image=quay.io/ceph/keepalived:2.2.4, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-1-wzwqmm, distribution-scope=public, version=2.2.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=keepalived-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vendor=Red Hat, Inc., build-date=2023-02-22T09:23:20, architecture=x86_64, summary=Provides keepalived on RHEL 9 for Ceph., io.buildah.version=1.28.2, io.openshift.expose-services=, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.tags=Ceph keepalived, name=keepalived, description=keepalived for Ceph, release=1793, vcs-type=git)
Dec  1 04:51:41 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cf9a83be003a5742eb3defd3205e28a9bdcabc388129bdaed23eb1c0164fe507/merged/etc/keepalived/keepalived.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:51:41 np0005540826 podman[85709]: 2025-12-01 09:51:41.978414052 +0000 UTC m=+0.109533957 container init b99b1792fbad48bc30fedd676b925d405af73aba1041bee68b375eadfb2319cf (image=quay.io/ceph/keepalived:2.2.4, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-1-wzwqmm, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.openshift.tags=Ceph keepalived, vendor=Red Hat, Inc., build-date=2023-02-22T09:23:20, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-type=git, io.k8s.display-name=Keepalived on RHEL 9, com.redhat.component=keepalived-container, name=keepalived, summary=Provides keepalived on RHEL 9 for Ceph., version=2.2.4, description=keepalived for Ceph, io.buildah.version=1.28.2)
Dec  1 04:51:41 np0005540826 podman[85709]: 2025-12-01 09:51:41.984160779 +0000 UTC m=+0.115280664 container start b99b1792fbad48bc30fedd676b925d405af73aba1041bee68b375eadfb2319cf (image=quay.io/ceph/keepalived:2.2.4, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-1-wzwqmm, vcs-type=git, build-date=2023-02-22T09:23:20, com.redhat.component=keepalived-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=keepalived, summary=Provides keepalived on RHEL 9 for Ceph., version=2.2.4, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.k8s.display-name=Keepalived on RHEL 9, vendor=Red Hat, Inc., distribution-scope=public, description=keepalived for Ceph, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.openshift.tags=Ceph keepalived, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.buildah.version=1.28.2, release=1793)
Dec  1 04:51:41 np0005540826 bash[85709]: b99b1792fbad48bc30fedd676b925d405af73aba1041bee68b375eadfb2319cf
Dec  1 04:51:41 np0005540826 podman[85709]: 2025-12-01 09:51:41.891968797 +0000 UTC m=+0.023088702 image pull 4a3a1ff181d97c6dcfa9138ad76eb99fa2c1e840298461d5a7a56133bc05b9a2 quay.io/ceph/keepalived:2.2.4
Dec  1 04:51:41 np0005540826 systemd[1]: Started Ceph keepalived.nfs.cephfs.compute-1.wzwqmm for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 04:51:42 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-1-wzwqmm[85724]: Mon Dec  1 09:51:42 2025: Starting Keepalived v2.2.4 (08/21,2021)
Dec  1 04:51:42 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-1-wzwqmm[85724]: Mon Dec  1 09:51:42 2025: Running on Linux 5.14.0-642.el9.x86_64 #1 SMP PREEMPT_DYNAMIC Thu Nov 20 14:15:03 UTC 2025 (built for Linux 5.14.0)
Dec  1 04:51:42 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-1-wzwqmm[85724]: Mon Dec  1 09:51:42 2025: Command line: '/usr/sbin/keepalived' '-n' '-l' '-f' '/etc/keepalived/keepalived.conf'
Dec  1 04:51:42 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-1-wzwqmm[85724]: Mon Dec  1 09:51:42 2025: Configuration file /etc/keepalived/keepalived.conf
Dec  1 04:51:42 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-1-wzwqmm[85724]: Mon Dec  1 09:51:42 2025: NOTICE: setting config option max_auto_priority should result in better keepalived performance
Dec  1 04:51:42 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-1-wzwqmm[85724]: Mon Dec  1 09:51:42 2025: Starting VRRP child process, pid=4
Dec  1 04:51:42 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-1-wzwqmm[85724]: Mon Dec  1 09:51:42 2025: Startup complete
Dec  1 04:51:42 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-1-wzwqmm[85724]: Mon Dec  1 09:51:42 2025: (VI_0) Entering BACKUP STATE (init)
Dec  1 04:51:42 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-1-wzwqmm[85724]: Mon Dec  1 09:51:42 2025: VRRP_Script(check_backend) succeeded
Dec  1 04:51:42 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 11.0 deep-scrub starts
Dec  1 04:51:42 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 11.0 deep-scrub ok
Dec  1 04:51:42 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "3"}]': finished
Dec  1 04:51:42 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e70 e70: 3 total, 3 up, 3 in
Dec  1 04:51:43 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 11.c scrub starts
Dec  1 04:51:43 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 11.c scrub ok
Dec  1 04:51:43 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:51:43 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:51:43 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:51:43 np0005540826 ceph-mon[80026]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Dec  1 04:51:43 np0005540826 ceph-mon[80026]: 192.168.122.2 is in 192.168.122.0/24 on compute-1 interface br-ex
Dec  1 04:51:43 np0005540826 ceph-mon[80026]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Dec  1 04:51:43 np0005540826 ceph-mon[80026]: Deploying daemon keepalived.nfs.cephfs.compute-0.gzwexr on compute-0
Dec  1 04:51:44 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 11.b scrub starts
Dec  1 04:51:44 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 11.b scrub ok
Dec  1 04:51:44 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e70 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:51:45 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e71 e71: 3 total, 3 up, 3 in
Dec  1 04:51:45 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 71 pg[10.16( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=4 ec=61/50 lis/c=69/61 les/c/f=70/62/0 sis=71) [0] r=0 lpr=71 pi=[61,71)/1 luod=0'0 crt=56'1015 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:45 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 71 pg[10.16( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=4 ec=61/50 lis/c=69/61 les/c/f=70/62/0 sis=71) [0] r=0 lpr=71 pi=[61,71)/1 crt=56'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:45 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 71 pg[10.12( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=4 ec=61/50 lis/c=69/61 les/c/f=70/62/0 sis=71) [0] r=0 lpr=71 pi=[61,71)/1 luod=0'0 crt=56'1015 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:45 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 71 pg[10.12( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=4 ec=61/50 lis/c=69/61 les/c/f=70/62/0 sis=71) [0] r=0 lpr=71 pi=[61,71)/1 crt=56'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:45 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 71 pg[10.a( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=6 ec=61/50 lis/c=69/61 les/c/f=70/62/0 sis=71) [0] r=0 lpr=71 pi=[61,71)/1 luod=0'0 crt=56'1015 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:45 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 71 pg[10.a( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=6 ec=61/50 lis/c=69/61 les/c/f=70/62/0 sis=71) [0] r=0 lpr=71 pi=[61,71)/1 crt=56'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:45 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 71 pg[10.e( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=6 ec=61/50 lis/c=69/61 les/c/f=70/62/0 sis=71) [0] r=0 lpr=71 pi=[61,71)/1 luod=0'0 crt=56'1015 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:45 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 71 pg[10.e( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=6 ec=61/50 lis/c=69/61 les/c/f=70/62/0 sis=71) [0] r=0 lpr=71 pi=[61,71)/1 crt=56'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:45 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 71 pg[10.2( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=6 ec=61/50 lis/c=69/61 les/c/f=70/62/0 sis=71) [0] r=0 lpr=71 pi=[61,71)/1 luod=0'0 crt=56'1015 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:45 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 71 pg[10.2( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=6 ec=61/50 lis/c=69/61 les/c/f=70/62/0 sis=71) [0] r=0 lpr=71 pi=[61,71)/1 crt=56'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:45 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 71 pg[10.6( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=6 ec=61/50 lis/c=69/61 les/c/f=70/62/0 sis=71) [0] r=0 lpr=71 pi=[61,71)/1 luod=0'0 crt=56'1015 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:45 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 71 pg[10.6( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=6 ec=61/50 lis/c=69/61 les/c/f=70/62/0 sis=71) [0] r=0 lpr=71 pi=[61,71)/1 crt=56'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:45 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 71 pg[10.1a( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=4 ec=61/50 lis/c=69/61 les/c/f=70/62/0 sis=71) [0] r=0 lpr=71 pi=[61,71)/1 luod=0'0 crt=56'1015 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:45 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 71 pg[10.1a( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=4 ec=61/50 lis/c=69/61 les/c/f=70/62/0 sis=71) [0] r=0 lpr=71 pi=[61,71)/1 crt=56'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:45 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 71 pg[10.1e( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=5 ec=61/50 lis/c=69/61 les/c/f=70/62/0 sis=71) [0] r=0 lpr=71 pi=[61,71)/1 luod=0'0 crt=56'1015 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:45 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 71 pg[10.1e( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=5 ec=61/50 lis/c=69/61 les/c/f=70/62/0 sis=71) [0] r=0 lpr=71 pi=[61,71)/1 crt=56'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:45 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 11.9 deep-scrub starts
Dec  1 04:51:45 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 11.9 deep-scrub ok
Dec  1 04:51:45 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-1-wzwqmm[85724]: Mon Dec  1 09:51:45 2025: (VI_0) Entering MASTER STATE
Dec  1 04:51:46 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 11.d scrub starts
Dec  1 04:51:46 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 11.d scrub ok
Dec  1 04:51:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:51:47 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  1 04:51:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:51:47 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Dec  1 04:51:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:51:47 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Dec  1 04:51:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:51:47 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Dec  1 04:51:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:51:47 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Dec  1 04:51:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:51:47 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Dec  1 04:51:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:51:47 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Dec  1 04:51:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:51:47 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 04:51:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:51:47 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 04:51:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:51:47 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 04:51:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:51:47 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Dec  1 04:51:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:51:47 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 04:51:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:51:47 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Dec  1 04:51:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:51:47 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Dec  1 04:51:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:51:47 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Dec  1 04:51:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:51:47 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Dec  1 04:51:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:51:47 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Dec  1 04:51:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:51:47 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Dec  1 04:51:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:51:47 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Dec  1 04:51:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:51:47 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Dec  1 04:51:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:51:47 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Dec  1 04:51:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:51:47 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Dec  1 04:51:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:51:47 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Dec  1 04:51:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:51:47 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Dec  1 04:51:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:51:47 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  1 04:51:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:51:47 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Dec  1 04:51:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:51:47 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  1 04:51:47 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e72 e72: 3 total, 3 up, 3 in
Dec  1 04:51:47 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 72 pg[10.1a( v 56'1015 (0'0,56'1015] local-lis/les=71/72 n=4 ec=61/50 lis/c=69/61 les/c/f=70/62/0 sis=71) [0] r=0 lpr=71 pi=[61,71)/1 crt=56'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:47 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 72 pg[10.6( v 56'1015 (0'0,56'1015] local-lis/les=71/72 n=6 ec=61/50 lis/c=69/61 les/c/f=70/62/0 sis=71) [0] r=0 lpr=71 pi=[61,71)/1 crt=56'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:47 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 72 pg[10.1e( v 56'1015 (0'0,56'1015] local-lis/les=71/72 n=5 ec=61/50 lis/c=69/61 les/c/f=70/62/0 sis=71) [0] r=0 lpr=71 pi=[61,71)/1 crt=56'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:47 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 72 pg[10.16( v 56'1015 (0'0,56'1015] local-lis/les=71/72 n=4 ec=61/50 lis/c=69/61 les/c/f=70/62/0 sis=71) [0] r=0 lpr=71 pi=[61,71)/1 crt=56'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:47 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 72 pg[10.e( v 56'1015 (0'0,56'1015] local-lis/les=71/72 n=6 ec=61/50 lis/c=69/61 les/c/f=70/62/0 sis=71) [0] r=0 lpr=71 pi=[61,71)/1 crt=56'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:47 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 72 pg[10.2( v 56'1015 (0'0,56'1015] local-lis/les=71/72 n=6 ec=61/50 lis/c=69/61 les/c/f=70/62/0 sis=71) [0] r=0 lpr=71 pi=[61,71)/1 crt=56'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:47 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 72 pg[10.a( v 56'1015 (0'0,56'1015] local-lis/les=71/72 n=6 ec=61/50 lis/c=69/61 les/c/f=70/62/0 sis=71) [0] r=0 lpr=71 pi=[61,71)/1 crt=56'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:47 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 72 pg[10.12( v 56'1015 (0'0,56'1015] local-lis/les=71/72 n=4 ec=61/50 lis/c=69/61 les/c/f=70/62/0 sis=71) [0] r=0 lpr=71 pi=[61,71)/1 crt=56'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:47 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 9.c scrub starts
Dec  1 04:51:47 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 9.c scrub ok
Dec  1 04:51:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:51:47 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0f4000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:51:48 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:51:48 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e40016e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:51:48 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 11.2 deep-scrub starts
Dec  1 04:51:48 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 11.2 deep-scrub ok
Dec  1 04:51:48 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:51:48 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:51:49 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 9.0 scrub starts
Dec  1 04:51:49 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 9.0 scrub ok
Dec  1 04:51:49 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:51:49 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0cc000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:51:49 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e72 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:51:50 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:51:50 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d8000fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:51:50 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 8.1 deep-scrub starts
Dec  1 04:51:50 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 8.1 deep-scrub ok
Dec  1 04:51:50 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis[85207]: [WARNING] 334/095150 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  1 04:51:50 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:51:50 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e40016e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:51:51 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 9.1 scrub starts
Dec  1 04:51:51 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 9.1 scrub ok
Dec  1 04:51:51 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:51:51 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d00016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:51:51 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:51:51 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:51:51 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:51:51 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "4"}]: dispatch
Dec  1 04:51:52 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e73 e73: 3 total, 3 up, 3 in
Dec  1 04:51:52 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:51:52 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0cc0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:51:52 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 8.7 scrub starts
Dec  1 04:51:52 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 8.7 scrub ok
Dec  1 04:51:52 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:51:52 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d00016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:51:53 np0005540826 ceph-mon[80026]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Dec  1 04:51:53 np0005540826 ceph-mon[80026]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Dec  1 04:51:53 np0005540826 ceph-mon[80026]: 192.168.122.2 is in 192.168.122.0/24 on compute-1 interface br-ex
Dec  1 04:51:53 np0005540826 ceph-mon[80026]: Deploying daemon keepalived.nfs.cephfs.compute-2.vkgipv on compute-2
Dec  1 04:51:53 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "4"}]': finished
Dec  1 04:51:53 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 11.6 scrub starts
Dec  1 04:51:53 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 11.6 scrub ok
Dec  1 04:51:53 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:51:53 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d8001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:51:54 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:51:54 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e40023f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:51:54 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 9.4 scrub starts
Dec  1 04:51:54 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 9.4 scrub ok
Dec  1 04:51:54 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e74 e74: 3 total, 3 up, 3 in
Dec  1 04:51:54 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-1-wzwqmm[85724]: Mon Dec  1 09:51:54 2025: (VI_0) Master received advert from 192.168.122.100 with higher priority 100, ours 90
Dec  1 04:51:54 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-1-wzwqmm[85724]: Mon Dec  1 09:51:54 2025: (VI_0) Entering BACKUP STATE
Dec  1 04:51:54 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "5"}]: dispatch
Dec  1 04:51:54 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e74 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:51:54 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:51:54 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0cc0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:51:55 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 8.0 scrub starts
Dec  1 04:51:55 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:51:55 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d00016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:51:55 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 8.0 scrub ok
Dec  1 04:51:56 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:51:56 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d8001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:51:56 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 11.18 scrub starts
Dec  1 04:51:56 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 11.18 scrub ok
Dec  1 04:51:56 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "5"}]': finished
Dec  1 04:51:56 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "6"}]: dispatch
Dec  1 04:51:56 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:51:56 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e40023f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:51:56 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e75 e75: 3 total, 3 up, 3 in
Dec  1 04:51:56 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 75 pg[10.1d( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=70/70 les/c/f=71/71/0 sis=75) [0] r=0 lpr=75 pi=[70,75)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:56 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 75 pg[10.5( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=69/69 les/c/f=70/70/0 sis=75) [0] r=0 lpr=75 pi=[69,75)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:56 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 75 pg[10.d( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=70/70 les/c/f=71/71/0 sis=75) [0] r=0 lpr=75 pi=[70,75)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:56 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 75 pg[10.15( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=70/70 les/c/f=71/71/0 sis=75) [0] r=0 lpr=75 pi=[70,75)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:57 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 9.1a scrub starts
Dec  1 04:51:57 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 9.1a scrub ok
Dec  1 04:51:57 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:51:57 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0cc0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:51:57 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "6"}]': finished
Dec  1 04:51:57 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "7"}]: dispatch
Dec  1 04:51:58 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e76 e76: 3 total, 3 up, 3 in
Dec  1 04:51:58 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 76 pg[10.16( v 56'1015 (0'0,56'1015] local-lis/les=71/72 n=4 ec=61/50 lis/c=71/71 les/c/f=72/72/0 sis=76 pruub=13.030603409s) [1] r=-1 lpr=76 pi=[71,76)/1 crt=56'1015 mlcod 0'0 active pruub 224.557281494s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:58 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 76 pg[10.16( v 56'1015 (0'0,56'1015] local-lis/les=71/72 n=4 ec=61/50 lis/c=71/71 les/c/f=72/72/0 sis=76 pruub=13.030569077s) [1] r=-1 lpr=76 pi=[71,76)/1 crt=56'1015 mlcod 0'0 unknown NOTIFY pruub 224.557281494s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:58 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 76 pg[10.15( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=70/70 les/c/f=71/71/0 sis=76) [0]/[2] r=-1 lpr=76 pi=[70,76)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:58 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 76 pg[10.15( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=70/70 les/c/f=71/71/0 sis=76) [0]/[2] r=-1 lpr=76 pi=[70,76)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:58 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 76 pg[10.d( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=70/70 les/c/f=71/71/0 sis=76) [0]/[2] r=-1 lpr=76 pi=[70,76)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:58 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 76 pg[10.d( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=70/70 les/c/f=71/71/0 sis=76) [0]/[2] r=-1 lpr=76 pi=[70,76)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:58 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 76 pg[10.e( v 56'1015 (0'0,56'1015] local-lis/les=71/72 n=6 ec=61/50 lis/c=71/71 les/c/f=72/72/0 sis=76 pruub=13.029588699s) [1] r=-1 lpr=76 pi=[71,76)/1 crt=56'1015 mlcod 0'0 active pruub 224.557235718s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:58 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 76 pg[10.e( v 56'1015 (0'0,56'1015] local-lis/les=71/72 n=6 ec=61/50 lis/c=71/71 les/c/f=72/72/0 sis=76 pruub=13.029567719s) [1] r=-1 lpr=76 pi=[71,76)/1 crt=56'1015 mlcod 0'0 unknown NOTIFY pruub 224.557235718s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:58 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 76 pg[10.5( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=69/69 les/c/f=70/70/0 sis=76) [0]/[2] r=-1 lpr=76 pi=[69,76)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:58 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 76 pg[10.5( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=69/69 les/c/f=70/70/0 sis=76) [0]/[2] r=-1 lpr=76 pi=[69,76)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:58 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 76 pg[10.6( v 56'1015 (0'0,56'1015] local-lis/les=71/72 n=6 ec=61/50 lis/c=71/71 les/c/f=72/72/0 sis=76 pruub=13.026332855s) [1] r=-1 lpr=76 pi=[71,76)/1 crt=56'1015 mlcod 0'0 active pruub 224.554275513s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:58 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 76 pg[10.6( v 56'1015 (0'0,56'1015] local-lis/les=71/72 n=6 ec=61/50 lis/c=71/71 les/c/f=72/72/0 sis=76 pruub=13.026309013s) [1] r=-1 lpr=76 pi=[71,76)/1 crt=56'1015 mlcod 0'0 unknown NOTIFY pruub 224.554275513s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:58 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 76 pg[10.1d( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=70/70 les/c/f=71/71/0 sis=76) [0]/[2] r=-1 lpr=76 pi=[70,76)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:58 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 76 pg[10.1e( v 56'1015 (0'0,56'1015] local-lis/les=71/72 n=5 ec=61/50 lis/c=71/71 les/c/f=72/72/0 sis=76 pruub=13.028991699s) [1] r=-1 lpr=76 pi=[71,76)/1 crt=56'1015 mlcod 0'0 active pruub 224.557296753s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:58 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 76 pg[10.1e( v 56'1015 (0'0,56'1015] local-lis/les=71/72 n=5 ec=61/50 lis/c=71/71 les/c/f=72/72/0 sis=76 pruub=13.028974533s) [1] r=-1 lpr=76 pi=[71,76)/1 crt=56'1015 mlcod 0'0 unknown NOTIFY pruub 224.557296753s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:58 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 76 pg[10.1d( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=70/70 les/c/f=71/71/0 sis=76) [0]/[2] r=-1 lpr=76 pi=[70,76)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:58 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:51:58 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d00016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:51:58 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e77 e77: 3 total, 3 up, 3 in
Dec  1 04:51:58 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 77 pg[10.e( v 56'1015 (0'0,56'1015] local-lis/les=71/72 n=6 ec=61/50 lis/c=71/71 les/c/f=72/72/0 sis=77) [1]/[0] r=0 lpr=77 pi=[71,77)/1 crt=56'1015 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:58 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 77 pg[10.e( v 56'1015 (0'0,56'1015] local-lis/les=71/72 n=6 ec=61/50 lis/c=71/71 les/c/f=72/72/0 sis=77) [1]/[0] r=0 lpr=77 pi=[71,77)/1 crt=56'1015 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:58 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 77 pg[10.1e( v 56'1015 (0'0,56'1015] local-lis/les=71/72 n=5 ec=61/50 lis/c=71/71 les/c/f=72/72/0 sis=77) [1]/[0] r=0 lpr=77 pi=[71,77)/1 crt=56'1015 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:58 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 77 pg[10.1e( v 56'1015 (0'0,56'1015] local-lis/les=71/72 n=5 ec=61/50 lis/c=71/71 les/c/f=72/72/0 sis=77) [1]/[0] r=0 lpr=77 pi=[71,77)/1 crt=56'1015 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:58 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 77 pg[10.6( v 56'1015 (0'0,56'1015] local-lis/les=71/72 n=6 ec=61/50 lis/c=71/71 les/c/f=72/72/0 sis=77) [1]/[0] r=0 lpr=77 pi=[71,77)/1 crt=56'1015 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:58 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 77 pg[10.6( v 56'1015 (0'0,56'1015] local-lis/les=71/72 n=6 ec=61/50 lis/c=71/71 les/c/f=72/72/0 sis=77) [1]/[0] r=0 lpr=77 pi=[71,77)/1 crt=56'1015 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:58 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 77 pg[10.16( v 56'1015 (0'0,56'1015] local-lis/les=71/72 n=4 ec=61/50 lis/c=71/71 les/c/f=72/72/0 sis=77) [1]/[0] r=0 lpr=77 pi=[71,77)/1 crt=56'1015 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:58 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 77 pg[10.16( v 56'1015 (0'0,56'1015] local-lis/les=71/72 n=4 ec=61/50 lis/c=71/71 les/c/f=72/72/0 sis=77) [1]/[0] r=0 lpr=77 pi=[71,77)/1 crt=56'1015 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:58 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:51:58 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d8001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:51:59 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "7"}]': finished
Dec  1 04:51:59 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:51:59 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:51:59 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e40023f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:51:59 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e78 e78: 3 total, 3 up, 3 in
Dec  1 04:51:59 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 78 pg[10.15( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=5 ec=61/50 lis/c=76/70 les/c/f=77/71/0 sis=78) [0] r=0 lpr=78 pi=[70,78)/1 luod=0'0 crt=56'1015 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:59 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 78 pg[10.15( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=5 ec=61/50 lis/c=76/70 les/c/f=77/71/0 sis=78) [0] r=0 lpr=78 pi=[70,78)/1 crt=56'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:59 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 78 pg[10.d( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=6 ec=61/50 lis/c=76/70 les/c/f=77/71/0 sis=78) [0] r=0 lpr=78 pi=[70,78)/1 luod=0'0 crt=56'1015 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:59 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 78 pg[10.d( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=6 ec=61/50 lis/c=76/70 les/c/f=77/71/0 sis=78) [0] r=0 lpr=78 pi=[70,78)/1 crt=56'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:59 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 78 pg[10.5( v 77'1030 (0'0,77'1030] local-lis/les=0/0 n=6 ec=61/50 lis/c=76/69 les/c/f=77/70/0 sis=78) [0] r=0 lpr=78 pi=[69,78)/1 luod=0'0 crt=71'1025 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:59 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 78 pg[10.5( v 77'1030 (0'0,77'1030] local-lis/les=0/0 n=6 ec=61/50 lis/c=76/69 les/c/f=77/70/0 sis=78) [0] r=0 lpr=78 pi=[69,78)/1 crt=71'1025 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:59 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 78 pg[10.1d( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=5 ec=61/50 lis/c=76/70 les/c/f=77/71/0 sis=78) [0] r=0 lpr=78 pi=[70,78)/1 luod=0'0 crt=56'1015 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:59 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 78 pg[10.1d( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=5 ec=61/50 lis/c=76/70 les/c/f=77/71/0 sis=78) [0] r=0 lpr=78 pi=[70,78)/1 crt=56'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:59 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 78 pg[10.1e( v 56'1015 (0'0,56'1015] local-lis/les=77/78 n=5 ec=61/50 lis/c=71/71 les/c/f=72/72/0 sis=77) [1]/[0] async=[1] r=0 lpr=77 pi=[71,77)/1 crt=56'1015 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:59 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 78 pg[10.16( v 56'1015 (0'0,56'1015] local-lis/les=77/78 n=4 ec=61/50 lis/c=71/71 les/c/f=72/72/0 sis=77) [1]/[0] async=[1] r=0 lpr=77 pi=[71,77)/1 crt=56'1015 mlcod 0'0 active+remapped mbc={255={(0+1)=4}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:59 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 78 pg[10.e( v 56'1015 (0'0,56'1015] local-lis/les=77/78 n=6 ec=61/50 lis/c=71/71 les/c/f=72/72/0 sis=77) [1]/[0] async=[1] r=0 lpr=77 pi=[71,77)/1 crt=56'1015 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:59 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 78 pg[10.6( v 56'1015 (0'0,56'1015] local-lis/les=77/78 n=6 ec=61/50 lis/c=71/71 les/c/f=72/72/0 sis=77) [1]/[0] async=[1] r=0 lpr=77 pi=[71,77)/1 crt=56'1015 mlcod 0'0 active+remapped mbc={255={(0+1)=6}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:59 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e78 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:52:00 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:52:00 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:52:00 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:52:00 np0005540826 ceph-mon[80026]: Deploying daemon alertmanager.compute-0 on compute-0
Dec  1 04:52:00 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "8"}]: dispatch
Dec  1 04:52:00 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:52:00 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0cc002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:00 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:52:00 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0002f00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:00 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e79 e79: 3 total, 3 up, 3 in
Dec  1 04:52:00 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 79 pg[10.16( v 56'1015 (0'0,56'1015] local-lis/les=77/78 n=4 ec=61/50 lis/c=77/71 les/c/f=78/72/0 sis=79 pruub=14.786541939s) [1] async=[1] r=-1 lpr=79 pi=[71,79)/1 crt=56'1015 mlcod 56'1015 active pruub 229.089981079s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:52:00 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 79 pg[10.16( v 56'1015 (0'0,56'1015] local-lis/les=77/78 n=4 ec=61/50 lis/c=77/71 les/c/f=78/72/0 sis=79 pruub=14.786443710s) [1] r=-1 lpr=79 pi=[71,79)/1 crt=56'1015 mlcod 0'0 unknown NOTIFY pruub 229.089981079s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:52:00 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 79 pg[10.e( v 56'1015 (0'0,56'1015] local-lis/les=77/78 n=6 ec=61/50 lis/c=77/71 les/c/f=78/72/0 sis=79 pruub=14.785654068s) [1] async=[1] r=-1 lpr=79 pi=[71,79)/1 crt=56'1015 mlcod 56'1015 active pruub 229.090164185s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:52:00 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 79 pg[10.e( v 56'1015 (0'0,56'1015] local-lis/les=77/78 n=6 ec=61/50 lis/c=77/71 les/c/f=78/72/0 sis=79 pruub=14.785565376s) [1] r=-1 lpr=79 pi=[71,79)/1 crt=56'1015 mlcod 0'0 unknown NOTIFY pruub 229.090164185s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:52:00 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 79 pg[10.6( v 56'1015 (0'0,56'1015] local-lis/les=77/78 n=6 ec=61/50 lis/c=77/71 les/c/f=78/72/0 sis=79 pruub=14.785223007s) [1] async=[1] r=-1 lpr=79 pi=[71,79)/1 crt=56'1015 mlcod 56'1015 active pruub 229.090255737s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:52:00 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 79 pg[10.6( v 56'1015 (0'0,56'1015] local-lis/les=77/78 n=6 ec=61/50 lis/c=77/71 les/c/f=78/72/0 sis=79 pruub=14.785154343s) [1] r=-1 lpr=79 pi=[71,79)/1 crt=56'1015 mlcod 0'0 unknown NOTIFY pruub 229.090255737s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:52:00 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 79 pg[10.1e( v 56'1015 (0'0,56'1015] local-lis/les=77/78 n=5 ec=61/50 lis/c=77/71 les/c/f=78/72/0 sis=79 pruub=14.784566879s) [1] async=[1] r=-1 lpr=79 pi=[71,79)/1 crt=56'1015 mlcod 56'1015 active pruub 229.089935303s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:52:00 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 79 pg[10.1e( v 56'1015 (0'0,56'1015] local-lis/les=77/78 n=5 ec=61/50 lis/c=77/71 les/c/f=78/72/0 sis=79 pruub=14.784517288s) [1] r=-1 lpr=79 pi=[71,79)/1 crt=56'1015 mlcod 0'0 unknown NOTIFY pruub 229.089935303s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:52:00 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 79 pg[10.15( v 56'1015 (0'0,56'1015] local-lis/les=78/79 n=5 ec=61/50 lis/c=76/70 les/c/f=77/71/0 sis=78) [0] r=0 lpr=78 pi=[70,78)/1 crt=56'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:52:00 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 79 pg[10.d( v 56'1015 (0'0,56'1015] local-lis/les=78/79 n=6 ec=61/50 lis/c=76/70 les/c/f=77/71/0 sis=78) [0] r=0 lpr=78 pi=[70,78)/1 crt=56'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:52:00 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 79 pg[10.5( v 77'1030 (0'0,77'1030] local-lis/les=78/79 n=6 ec=61/50 lis/c=76/69 les/c/f=77/70/0 sis=78) [0] r=0 lpr=78 pi=[69,78)/1 crt=77'1030 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:52:00 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 79 pg[10.1d( v 56'1015 (0'0,56'1015] local-lis/les=78/79 n=5 ec=61/50 lis/c=76/70 les/c/f=77/71/0 sis=78) [0] r=0 lpr=78 pi=[70,78)/1 crt=56'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:52:01 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "8"}]': finished
Dec  1 04:52:01 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:52:01 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 10.a scrub starts
Dec  1 04:52:01 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 10.a scrub ok
Dec  1 04:52:01 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:52:01 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d8002f50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:02 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:52:02 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e40023f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:02 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e80 e80: 3 total, 3 up, 3 in
Dec  1 04:52:02 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 10.2 deep-scrub starts
Dec  1 04:52:02 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 10.2 deep-scrub ok
Dec  1 04:52:02 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:52:02 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0cc002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:03 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 10.15 scrub starts
Dec  1 04:52:03 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 10.15 scrub ok
Dec  1 04:52:03 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:52:03 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e81 e81: 3 total, 3 up, 3 in
Dec  1 04:52:03 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:52:03 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0002f00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:04 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:52:04 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d8002f50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:04 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 9.1b scrub starts
Dec  1 04:52:04 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 9.1b scrub ok
Dec  1 04:52:04 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:52:04 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:52:04 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:52:04 np0005540826 ceph-mon[80026]: Regenerating cephadm self-signed grafana TLS certificates
Dec  1 04:52:04 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:52:04 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:52:04 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "dashboard set-grafana-api-ssl-verify", "value": "false"}]: dispatch
Dec  1 04:52:04 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:52:04 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e81 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:52:04 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:52:04 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e40023f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:05 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e82 e82: 3 total, 3 up, 3 in
Dec  1 04:52:05 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 8.1a scrub starts
Dec  1 04:52:05 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 8.1a scrub ok
Dec  1 04:52:05 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:52:05 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0cc002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:05 np0005540826 ceph-mon[80026]: Deploying daemon grafana.compute-0 on compute-0
Dec  1 04:52:06 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e83 e83: 3 total, 3 up, 3 in
Dec  1 04:52:06 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:52:06 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:06 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 9.19 scrub starts
Dec  1 04:52:06 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 9.19 scrub ok
Dec  1 04:52:06 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:52:06 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d8002f50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:07 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:52:07 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 9.1e scrub starts
Dec  1 04:52:07 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 9.1e scrub ok
Dec  1 04:52:07 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:52:07 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e40023f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:08 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:52:08 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0cc003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:08 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 9.1f scrub starts
Dec  1 04:52:08 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 9.1f scrub ok
Dec  1 04:52:08 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:52:08 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0cc003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:09 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 8.1e deep-scrub starts
Dec  1 04:52:09 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 8.1e deep-scrub ok
Dec  1 04:52:09 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:52:09 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d8004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:09 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e83 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:52:10 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:52:10 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e40023f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:10 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 8.1d scrub starts
Dec  1 04:52:10 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 8.1d scrub ok
Dec  1 04:52:10 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:52:10 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e40023f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:11 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 9.1c scrub starts
Dec  1 04:52:11 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 9.1c scrub ok
Dec  1 04:52:11 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:52:11 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e40023f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:12 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 11.1f scrub starts
Dec  1 04:52:12 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 11.1f scrub ok
Dec  1 04:52:12 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:52:12 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d8004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:12 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "9"}]: dispatch
Dec  1 04:52:12 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e84 e84: 3 total, 3 up, 3 in
Dec  1 04:52:12 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 84 pg[10.18( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=84) [0] r=0 lpr=84 pi=[61,84)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:52:12 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 84 pg[10.8( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=84) [0] r=0 lpr=84 pi=[61,84)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:52:12 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:52:12 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0cc003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:13 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 11.10 scrub starts
Dec  1 04:52:13 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 11.10 scrub ok
Dec  1 04:52:13 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e85 e85: 3 total, 3 up, 3 in
Dec  1 04:52:13 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 85 pg[10.8( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=85) [0]/[1] r=-1 lpr=85 pi=[61,85)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:52:13 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 85 pg[10.8( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=85) [0]/[1] r=-1 lpr=85 pi=[61,85)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec  1 04:52:13 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 85 pg[10.18( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=85) [0]/[1] r=-1 lpr=85 pi=[61,85)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:52:13 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 85 pg[10.18( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=85) [0]/[1] r=-1 lpr=85 pi=[61,85)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec  1 04:52:13 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:52:13 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:13 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "9"}]': finished
Dec  1 04:52:13 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:52:13 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:52:13 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:52:13 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "10"}]: dispatch
Dec  1 04:52:14 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 8.13 scrub starts
Dec  1 04:52:14 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 8.13 scrub ok
Dec  1 04:52:14 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:52:14 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e40023f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:14 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e86 e86: 3 total, 3 up, 3 in
Dec  1 04:52:14 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:52:14 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:52:14 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d8004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:15 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:52:15 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:52:15 np0005540826 ceph-mon[80026]: Deploying daemon haproxy.rgw.default.compute-0.owswdq on compute-0
Dec  1 04:52:15 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "10"}]': finished
Dec  1 04:52:15 np0005540826 systemd-logind[787]: New session 36 of user zuul.
Dec  1 04:52:15 np0005540826 systemd[1]: Started Session 36 of User zuul.
Dec  1 04:52:15 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 11.11 scrub starts
Dec  1 04:52:15 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 11.11 scrub ok
Dec  1 04:52:15 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:52:15 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0cc003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:16 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 12.19 scrub starts
Dec  1 04:52:16 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 12.19 scrub ok
Dec  1 04:52:16 np0005540826 python3.9[85918]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:52:16 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:52:16 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:16 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:52:16 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e40023f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:17 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 12.a scrub starts
Dec  1 04:52:17 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 12.a scrub ok
Dec  1 04:52:17 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e87 e87: 3 total, 3 up, 3 in
Dec  1 04:52:17 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:52:17 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d8004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:18 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 12.1c deep-scrub starts
Dec  1 04:52:18 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 87 pg[10.8( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=7 ec=61/50 lis/c=85/61 les/c/f=86/62/0 sis=87) [0] r=0 lpr=87 pi=[61,87)/1 luod=0'0 crt=56'1015 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:52:18 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 87 pg[10.18( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=4 ec=61/50 lis/c=85/61 les/c/f=86/62/0 sis=87) [0] r=0 lpr=87 pi=[61,87)/1 luod=0'0 crt=56'1015 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:52:18 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 87 pg[10.8( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=7 ec=61/50 lis/c=85/61 les/c/f=86/62/0 sis=87) [0] r=0 lpr=87 pi=[61,87)/1 crt=56'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:52:18 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 87 pg[10.18( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=4 ec=61/50 lis/c=85/61 les/c/f=86/62/0 sis=87) [0] r=0 lpr=87 pi=[61,87)/1 crt=56'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:52:18 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 87 pg[10.a( v 56'1015 (0'0,56'1015] local-lis/les=71/72 n=9 ec=61/50 lis/c=71/71 les/c/f=72/72/0 sis=87 pruub=8.930681229s) [1] r=-1 lpr=87 pi=[71,87)/1 crt=56'1015 mlcod 0'0 active pruub 240.557510376s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:52:18 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 87 pg[10.a( v 56'1015 (0'0,56'1015] local-lis/les=71/72 n=9 ec=61/50 lis/c=71/71 les/c/f=72/72/0 sis=87 pruub=8.930647850s) [1] r=-1 lpr=87 pi=[71,87)/1 crt=56'1015 mlcod 0'0 unknown NOTIFY pruub 240.557510376s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:52:18 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 87 pg[10.1a( v 56'1015 (0'0,56'1015] local-lis/les=71/72 n=4 ec=61/50 lis/c=71/71 les/c/f=72/72/0 sis=87 pruub=8.926884651s) [1] r=-1 lpr=87 pi=[71,87)/1 crt=56'1015 mlcod 0'0 active pruub 240.554718018s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:52:18 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 87 pg[10.1a( v 56'1015 (0'0,56'1015] local-lis/les=71/72 n=4 ec=61/50 lis/c=71/71 les/c/f=72/72/0 sis=87 pruub=8.926853180s) [1] r=-1 lpr=87 pi=[71,87)/1 crt=56'1015 mlcod 0'0 unknown NOTIFY pruub 240.554718018s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:52:18 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 12.1c deep-scrub ok
Dec  1 04:52:18 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "11"}]: dispatch
Dec  1 04:52:18 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e88 e88: 3 total, 3 up, 3 in
Dec  1 04:52:18 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 88 pg[10.a( v 56'1015 (0'0,56'1015] local-lis/les=71/72 n=9 ec=61/50 lis/c=71/71 les/c/f=72/72/0 sis=88) [1]/[0] r=0 lpr=88 pi=[71,88)/1 crt=56'1015 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:52:18 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 88 pg[10.a( v 56'1015 (0'0,56'1015] local-lis/les=71/72 n=9 ec=61/50 lis/c=71/71 les/c/f=72/72/0 sis=88) [1]/[0] r=0 lpr=88 pi=[71,88)/1 crt=56'1015 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Dec  1 04:52:18 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 88 pg[10.1a( v 56'1015 (0'0,56'1015] local-lis/les=71/72 n=4 ec=61/50 lis/c=71/71 les/c/f=72/72/0 sis=88) [1]/[0] r=0 lpr=88 pi=[71,88)/1 crt=56'1015 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:52:18 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 88 pg[10.1a( v 56'1015 (0'0,56'1015] local-lis/les=71/72 n=4 ec=61/50 lis/c=71/71 les/c/f=72/72/0 sis=88) [1]/[0] r=0 lpr=88 pi=[71,88)/1 crt=56'1015 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Dec  1 04:52:18 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 88 pg[10.8( v 56'1015 (0'0,56'1015] local-lis/les=87/88 n=7 ec=61/50 lis/c=85/61 les/c/f=86/62/0 sis=87) [0] r=0 lpr=87 pi=[61,87)/1 crt=56'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:52:18 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 88 pg[10.18( v 56'1015 (0'0,56'1015] local-lis/les=87/88 n=4 ec=61/50 lis/c=85/61 les/c/f=86/62/0 sis=87) [0] r=0 lpr=87 pi=[61,87)/1 crt=56'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:52:18 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:52:18 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d8004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:18 np0005540826 python3.9[86134]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail#012pushd /var/tmp#012curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz#012pushd repo-setup-main#012python3 -m venv ./venv#012PBR_VERSION=0.0.0 ./venv/bin/pip install ./#012./venv/bin/repo-setup current-podified -b antelope#012popd#012rm -rf repo-setup-main#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:52:18 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:52:18 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:18 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:52:18 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.003000077s ======
Dec  1 04:52:18 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:52:18.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000077s
Dec  1 04:52:19 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 12.e scrub starts
Dec  1 04:52:19 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 12.e scrub ok
Dec  1 04:52:19 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "11"}]': finished
Dec  1 04:52:19 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:52:19 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:52:19 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:52:19 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:52:19 np0005540826 ceph-mon[80026]: Deploying daemon haproxy.rgw.default.compute-2.zubkfi on compute-2
Dec  1 04:52:19 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e89 e89: 3 total, 3 up, 3 in
Dec  1 04:52:19 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 89 pg[10.a( v 56'1015 (0'0,56'1015] local-lis/les=88/89 n=9 ec=61/50 lis/c=71/71 les/c/f=72/72/0 sis=88) [1]/[0] async=[1] r=0 lpr=88 pi=[71,88)/1 crt=56'1015 mlcod 0'0 active+remapped mbc={255={(0+1)=9}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:52:19 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 89 pg[10.1a( v 56'1015 (0'0,56'1015] local-lis/les=88/89 n=4 ec=61/50 lis/c=71/71 les/c/f=72/72/0 sis=88) [1]/[0] async=[1] r=0 lpr=88 pi=[71,88)/1 crt=56'1015 mlcod 0'0 active+remapped mbc={255={(0+1)=4}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:52:19 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:52:19 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e40023f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:19 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:52:20 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 12.8 scrub starts
Dec  1 04:52:20 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 12.8 scrub ok
Dec  1 04:52:20 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:52:20 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0f4002010 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:20 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e90 e90: 3 total, 3 up, 3 in
Dec  1 04:52:20 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 90 pg[10.a( v 56'1015 (0'0,56'1015] local-lis/les=88/89 n=9 ec=61/50 lis/c=88/71 les/c/f=89/72/0 sis=90 pruub=14.992375374s) [1] async=[1] r=-1 lpr=90 pi=[71,90)/1 crt=56'1015 mlcod 56'1015 active pruub 248.909484863s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:52:20 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 90 pg[10.a( v 56'1015 (0'0,56'1015] local-lis/les=88/89 n=9 ec=61/50 lis/c=88/71 les/c/f=89/72/0 sis=90 pruub=14.991620064s) [1] r=-1 lpr=90 pi=[71,90)/1 crt=56'1015 mlcod 0'0 unknown NOTIFY pruub 248.909484863s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:52:20 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 90 pg[10.1a( v 56'1015 (0'0,56'1015] local-lis/les=88/89 n=4 ec=61/50 lis/c=88/71 les/c/f=89/72/0 sis=90 pruub=14.994647026s) [1] async=[1] r=-1 lpr=90 pi=[71,90)/1 crt=56'1015 mlcod 56'1015 active pruub 248.912506104s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:52:20 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 90 pg[10.1a( v 56'1015 (0'0,56'1015] local-lis/les=88/89 n=4 ec=61/50 lis/c=88/71 les/c/f=89/72/0 sis=90 pruub=14.994507790s) [1] r=-1 lpr=90 pi=[71,90)/1 crt=56'1015 mlcod 0'0 unknown NOTIFY pruub 248.912506104s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:52:20 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:52:20 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d8004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:20 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:52:20 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:52:20 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:52:20.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:52:21 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 12.b scrub starts
Dec  1 04:52:21 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 12.b scrub ok
Dec  1 04:52:21 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:52:21 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:52:21 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:52:21 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:52:21 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e91 e91: 3 total, 3 up, 3 in
Dec  1 04:52:21 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:52:21 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c0000d00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:21 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:52:21 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:52:21 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:52:21.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:52:22 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 12.6 scrub starts
Dec  1 04:52:22 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 12.6 scrub ok
Dec  1 04:52:22 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:52:22 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e40023f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:22 np0005540826 ceph-mon[80026]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Dec  1 04:52:22 np0005540826 ceph-mon[80026]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Dec  1 04:52:22 np0005540826 ceph-mon[80026]: Deploying daemon keepalived.rgw.default.compute-0.jnboao on compute-0
Dec  1 04:52:22 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:52:22 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0f4002010 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:22 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:52:22 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:52:22 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:52:22.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:52:23 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 12.10 scrub starts
Dec  1 04:52:23 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 12.10 scrub ok
Dec  1 04:52:23 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:52:23 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d8004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:23 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:52:23 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 04:52:23 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:52:23.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 04:52:24 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:52:24 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:52:24 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:52:24 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 12.12 scrub starts
Dec  1 04:52:24 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 12.12 scrub ok
Dec  1 04:52:24 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:52:24 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c0001820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:24 np0005540826 ceph-mon[80026]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #13. Immutable memtables: 0.
Dec  1 04:52:24 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-09:52:24.373890) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  1 04:52:24 np0005540826 ceph-mon[80026]: rocksdb: [db/flush_job.cc:856] [default] [JOB 3] Flushing memtable with next log file: 13
Dec  1 04:52:24 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582744374099, "job": 3, "event": "flush_started", "num_memtables": 1, "num_entries": 7579, "num_deletes": 256, "total_data_size": 20885464, "memory_usage": 21806096, "flush_reason": "Manual Compaction"}
Dec  1 04:52:24 np0005540826 ceph-mon[80026]: rocksdb: [db/flush_job.cc:885] [default] [JOB 3] Level-0 flush table #14: started
Dec  1 04:52:24 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582744466537, "cf_name": "default", "job": 3, "event": "table_file_creation", "file_number": 14, "file_size": 12787601, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 231, "largest_seqno": 7584, "table_properties": {"data_size": 12758168, "index_size": 18566, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 9797, "raw_key_size": 95821, "raw_average_key_size": 24, "raw_value_size": 12683680, "raw_average_value_size": 3248, "num_data_blocks": 818, "num_entries": 3905, "num_filter_entries": 3905, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582559, "oldest_key_time": 1764582559, "file_creation_time": 1764582744, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d6e1f97e-eb58-41c1-b758-cb672eabd75e", "db_session_id": "KSHNHU57VR1LHZ6EZUM4", "orig_file_number": 14, "seqno_to_time_mapping": "N/A"}}
Dec  1 04:52:24 np0005540826 ceph-mon[80026]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 3] Flush lasted 92676 microseconds, and 40230 cpu microseconds.
Dec  1 04:52:24 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-09:52:24.466594) [db/flush_job.cc:967] [default] [JOB 3] Level-0 flush table #14: 12787601 bytes OK
Dec  1 04:52:24 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-09:52:24.466613) [db/memtable_list.cc:519] [default] Level-0 commit table #14 started
Dec  1 04:52:24 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-09:52:24.468703) [db/memtable_list.cc:722] [default] Level-0 commit table #14: memtable #1 done
Dec  1 04:52:24 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-09:52:24.468716) EVENT_LOG_v1 {"time_micros": 1764582744468712, "job": 3, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [2, 0, 0, 0, 0, 0, 0], "immutable_memtables": 0}
Dec  1 04:52:24 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-09:52:24.468730) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: files[2 0 0 0 0 0 0] max score 0.50
Dec  1 04:52:24 np0005540826 ceph-mon[80026]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 3] Try to delete WAL files size 20843783, prev total WAL file size 20843783, number of live WAL files 2.
Dec  1 04:52:24 np0005540826 ceph-mon[80026]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000009.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 04:52:24 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-09:52:24.472451) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730030' seq:72057594037927935, type:22 .. '7061786F7300323532' seq:0, type:0; will stop at (end)
Dec  1 04:52:24 np0005540826 ceph-mon[80026]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 4] Compacting 2@0 files to L6, score -1.00
Dec  1 04:52:24 np0005540826 ceph-mon[80026]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 3 Base level 0, inputs: [14(12MB) 8(1648B)]
Dec  1 04:52:24 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582744472621, "job": 4, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [14, 8], "score": -1, "input_data_size": 12789249, "oldest_snapshot_seqno": -1}
Dec  1 04:52:24 np0005540826 ceph-mon[80026]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 4] Generated table #15: 3653 keys, 12784169 bytes, temperature: kUnknown
Dec  1 04:52:24 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582744567953, "cf_name": "default", "job": 4, "event": "table_file_creation", "file_number": 15, "file_size": 12784169, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12755395, "index_size": 18552, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 9157, "raw_key_size": 91619, "raw_average_key_size": 25, "raw_value_size": 12683960, "raw_average_value_size": 3472, "num_data_blocks": 818, "num_entries": 3653, "num_filter_entries": 3653, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582559, "oldest_key_time": 0, "file_creation_time": 1764582744, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d6e1f97e-eb58-41c1-b758-cb672eabd75e", "db_session_id": "KSHNHU57VR1LHZ6EZUM4", "orig_file_number": 15, "seqno_to_time_mapping": "N/A"}}
Dec  1 04:52:24 np0005540826 ceph-mon[80026]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 04:52:24 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-09:52:24.568225) [db/compaction/compaction_job.cc:1663] [default] [JOB 4] Compacted 2@0 files to L6 => 12784169 bytes
Dec  1 04:52:24 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-09:52:24.569360) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 134.1 rd, 134.0 wr, level 6, files in(2, 0) out(1 +0 blob) MB in(12.2, 0.0 +0.0 blob) out(12.2 +0.0 blob), read-write-amplify(2.0) write-amplify(1.0) OK, records in: 3910, records dropped: 257 output_compression: NoCompression
Dec  1 04:52:24 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-09:52:24.569377) EVENT_LOG_v1 {"time_micros": 1764582744569369, "job": 4, "event": "compaction_finished", "compaction_time_micros": 95398, "compaction_time_cpu_micros": 49180, "output_level": 6, "num_output_files": 1, "total_output_size": 12784169, "num_input_records": 3910, "num_output_records": 3653, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  1 04:52:24 np0005540826 ceph-mon[80026]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000014.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 04:52:24 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582744571300, "job": 4, "event": "table_file_deletion", "file_number": 14}
Dec  1 04:52:24 np0005540826 ceph-mon[80026]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000008.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 04:52:24 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582744571341, "job": 4, "event": "table_file_deletion", "file_number": 8}
Dec  1 04:52:24 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-09:52:24.472262) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:52:24 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:52:24 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:52:24 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e40023f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:24 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:52:24 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:52:24 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:52:24.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:52:25 np0005540826 ceph-mon[80026]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Dec  1 04:52:25 np0005540826 ceph-mon[80026]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Dec  1 04:52:25 np0005540826 ceph-mon[80026]: Deploying daemon keepalived.rgw.default.compute-2.pcdbyn on compute-2
Dec  1 04:52:25 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 12.c scrub starts
Dec  1 04:52:25 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 12.c scrub ok
Dec  1 04:52:25 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:52:25 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0f4002010 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:25 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:52:25 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:52:25 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:52:25.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:52:26 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 10.d scrub starts
Dec  1 04:52:26 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 10.d scrub ok
Dec  1 04:52:26 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:52:26 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:52:26 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:52:26 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:52:26 np0005540826 ceph-mon[80026]: Deploying daemon prometheus.compute-0 on compute-0
Dec  1 04:52:26 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:52:26 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d8004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:26 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:52:26 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c0001820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:26 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:52:26 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:52:26 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:52:26.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:52:27 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 10.5 deep-scrub starts
Dec  1 04:52:27 np0005540826 ceph-osd[77525]: log_channel(cluster) log [DBG] : 10.5 deep-scrub ok
Dec  1 04:52:27 np0005540826 systemd[1]: session-36.scope: Deactivated successfully.
Dec  1 04:52:27 np0005540826 systemd[1]: session-36.scope: Consumed 8.293s CPU time.
Dec  1 04:52:27 np0005540826 systemd-logind[787]: Session 36 logged out. Waiting for processes to exit.
Dec  1 04:52:27 np0005540826 systemd-logind[787]: Removed session 36.
Dec  1 04:52:27 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:52:27 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e40023f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:27 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:52:27 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:52:27 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:52:27.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:52:28 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:52:28 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0f4002010 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:28 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e92 e92: 3 total, 3 up, 3 in
Dec  1 04:52:28 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "12"}]: dispatch
Dec  1 04:52:28 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:52:28 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d8004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:28 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:52:28 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:52:28 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:52:28.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:52:29 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:52:29 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c0001820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:29 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:52:29 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "12"}]': finished
Dec  1 04:52:29 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "13"}]: dispatch
Dec  1 04:52:29 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e93 e93: 3 total, 3 up, 3 in
Dec  1 04:52:29 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:52:29 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:52:29 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:52:29.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:52:29 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:52:30 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:52:30 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e40023f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:30 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "13"}]': finished
Dec  1 04:52:30 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:52:30 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0f40091b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:30 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e94 e94: 3 total, 3 up, 3 in
Dec  1 04:52:30 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:52:30 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:52:30 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:52:30.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:52:31 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:52:31 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d8004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:31 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:52:31 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 04:52:31 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:52:31.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 04:52:31 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:52:31 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:52:31 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:52:31 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "mgr module enable", "module": "prometheus"}]: dispatch
Dec  1 04:52:31 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e95 e95: 3 total, 3 up, 3 in
Dec  1 04:52:32 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:52:32 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c0002cb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:32 np0005540826 systemd-logind[787]: Session 34 logged out. Waiting for processes to exit.
Dec  1 04:52:32 np0005540826 systemd[1]: session-34.scope: Deactivated successfully.
Dec  1 04:52:32 np0005540826 systemd[1]: session-34.scope: Consumed 19.132s CPU time.
Dec  1 04:52:32 np0005540826 systemd-logind[787]: Removed session 34.
Dec  1 04:52:32 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: ignoring --setuser ceph since I am not root
Dec  1 04:52:32 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: ignoring --setgroup ceph since I am not root
Dec  1 04:52:32 np0005540826 ceph-mgr[80334]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Dec  1 04:52:32 np0005540826 ceph-mgr[80334]: pidfile_write: ignore empty --pid-file
Dec  1 04:52:32 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'alerts'
Dec  1 04:52:32 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:52:32.767+0000 7f7998628140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Dec  1 04:52:32 np0005540826 ceph-mgr[80334]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Dec  1 04:52:32 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'balancer'
Dec  1 04:52:32 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:52:32.860+0000 7f7998628140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Dec  1 04:52:32 np0005540826 ceph-mgr[80334]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Dec  1 04:52:32 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'cephadm'
Dec  1 04:52:32 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:52:32 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e40023f0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:32 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:52:32 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:52:32 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:52:32.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:52:33 np0005540826 ceph-mon[80026]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "mgr module enable", "module": "prometheus"}]': finished
Dec  1 04:52:33 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e96 e96: 3 total, 3 up, 3 in
Dec  1 04:52:33 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:52:33 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0f40091b0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:33 np0005540826 ceph-mon[80026]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #16. Immutable memtables: 0.
Dec  1 04:52:33 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-09:52:33.703760) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  1 04:52:33 np0005540826 ceph-mon[80026]: rocksdb: [db/flush_job.cc:856] [default] [JOB 5] Flushing memtable with next log file: 16
Dec  1 04:52:33 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582753703840, "job": 5, "event": "flush_started", "num_memtables": 1, "num_entries": 519, "num_deletes": 251, "total_data_size": 1054080, "memory_usage": 1066208, "flush_reason": "Manual Compaction"}
Dec  1 04:52:33 np0005540826 ceph-mon[80026]: rocksdb: [db/flush_job.cc:885] [default] [JOB 5] Level-0 flush table #17: started
Dec  1 04:52:33 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582753723014, "cf_name": "default", "job": 5, "event": "table_file_creation", "file_number": 17, "file_size": 698083, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 7589, "largest_seqno": 8103, "table_properties": {"data_size": 695126, "index_size": 865, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1029, "raw_key_size": 6844, "raw_average_key_size": 17, "raw_value_size": 688892, "raw_average_value_size": 1770, "num_data_blocks": 37, "num_entries": 389, "num_filter_entries": 389, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582745, "oldest_key_time": 1764582745, "file_creation_time": 1764582753, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d6e1f97e-eb58-41c1-b758-cb672eabd75e", "db_session_id": "KSHNHU57VR1LHZ6EZUM4", "orig_file_number": 17, "seqno_to_time_mapping": "N/A"}}
Dec  1 04:52:33 np0005540826 ceph-mon[80026]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 5] Flush lasted 19318 microseconds, and 3623 cpu microseconds.
Dec  1 04:52:33 np0005540826 ceph-mon[80026]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 04:52:33 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-09:52:33.723054) [db/flush_job.cc:967] [default] [JOB 5] Level-0 flush table #17: 698083 bytes OK
Dec  1 04:52:33 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-09:52:33.723105) [db/memtable_list.cc:519] [default] Level-0 commit table #17 started
Dec  1 04:52:33 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-09:52:33.724931) [db/memtable_list.cc:722] [default] Level-0 commit table #17: memtable #1 done
Dec  1 04:52:33 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-09:52:33.724946) EVENT_LOG_v1 {"time_micros": 1764582753724942, "job": 5, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  1 04:52:33 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-09:52:33.724962) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  1 04:52:33 np0005540826 ceph-mon[80026]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 5] Try to delete WAL files size 1050844, prev total WAL file size 1050844, number of live WAL files 2.
Dec  1 04:52:33 np0005540826 ceph-mon[80026]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000013.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 04:52:33 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-09:52:33.725756) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B760030' seq:72057594037927935, type:22 .. '6B7600323532' seq:0, type:0; will stop at (end)
Dec  1 04:52:33 np0005540826 ceph-mon[80026]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 6] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  1 04:52:33 np0005540826 ceph-mon[80026]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 5 Base level 0, inputs: [17(681KB)], [15(12MB)]
Dec  1 04:52:33 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582753725834, "job": 6, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [17], "files_L6": [15], "score": -1, "input_data_size": 13482252, "oldest_snapshot_seqno": -1}
Dec  1 04:52:33 np0005540826 ceph-mon[80026]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 6] Generated table #18: 3517 keys, 13035451 bytes, temperature: kUnknown
Dec  1 04:52:33 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582753823229, "cf_name": "default", "job": 6, "event": "table_file_creation", "file_number": 18, "file_size": 13035451, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13007415, "index_size": 18134, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 8837, "raw_key_size": 90858, "raw_average_key_size": 25, "raw_value_size": 12938134, "raw_average_value_size": 3678, "num_data_blocks": 782, "num_entries": 3517, "num_filter_entries": 3517, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582559, "oldest_key_time": 0, "file_creation_time": 1764582753, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d6e1f97e-eb58-41c1-b758-cb672eabd75e", "db_session_id": "KSHNHU57VR1LHZ6EZUM4", "orig_file_number": 18, "seqno_to_time_mapping": "N/A"}}
Dec  1 04:52:33 np0005540826 ceph-mon[80026]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 04:52:33 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-09:52:33.823739) [db/compaction/compaction_job.cc:1663] [default] [JOB 6] Compacted 1@0 + 1@6 files to L6 => 13035451 bytes
Dec  1 04:52:33 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-09:52:33.825101) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 138.3 rd, 133.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.7, 12.2 +0.0 blob) out(12.4 +0.0 blob), read-write-amplify(38.0) write-amplify(18.7) OK, records in: 4042, records dropped: 525 output_compression: NoCompression
Dec  1 04:52:33 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-09:52:33.825133) EVENT_LOG_v1 {"time_micros": 1764582753825119, "job": 6, "event": "compaction_finished", "compaction_time_micros": 97494, "compaction_time_cpu_micros": 53565, "output_level": 6, "num_output_files": 1, "total_output_size": 13035451, "num_input_records": 4042, "num_output_records": 3517, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  1 04:52:33 np0005540826 ceph-mon[80026]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000017.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 04:52:33 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582753825486, "job": 6, "event": "table_file_deletion", "file_number": 17}
Dec  1 04:52:33 np0005540826 ceph-mon[80026]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000015.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 04:52:33 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582753832902, "job": 6, "event": "table_file_deletion", "file_number": 15}
Dec  1 04:52:33 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-09:52:33.725640) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:52:33 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-09:52:33.833161) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:52:33 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-09:52:33.833214) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:52:33 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-09:52:33.833268) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:52:33 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-09:52:33.833315) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:52:33 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-09:52:33.833362) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:52:33 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:52:33 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000012s ======
Dec  1 04:52:33 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:52:33.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000012s
Dec  1 04:52:33 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'crash'
Dec  1 04:52:33 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:52:33.938+0000 7f7998628140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Dec  1 04:52:33 np0005540826 ceph-mgr[80334]: mgr[py] Module crash has missing NOTIFY_TYPES member
Dec  1 04:52:33 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'dashboard'
Dec  1 04:52:34 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e97 e97: 3 total, 3 up, 3 in
Dec  1 04:52:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:52:34 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d8004050 fd 14 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:34 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'devicehealth'
Dec  1 04:52:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:52:34.626+0000 7f7998628140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Dec  1 04:52:34 np0005540826 ceph-mgr[80334]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Dec  1 04:52:34 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'diskprediction_local'
Dec  1 04:52:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Dec  1 04:52:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Dec  1 04:52:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]:  from numpy import show_config as show_numpy_config
Dec  1 04:52:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:52:34.808+0000 7f7998628140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Dec  1 04:52:34 np0005540826 ceph-mgr[80334]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Dec  1 04:52:34 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'influx'
Dec  1 04:52:34 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e97 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:52:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:52:34.880+0000 7f7998628140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Dec  1 04:52:34 np0005540826 ceph-mgr[80334]: mgr[py] Module influx has missing NOTIFY_TYPES member
Dec  1 04:52:34 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'insights'
Dec  1 04:52:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:52:34 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c0002cb0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:34 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'iostat'
Dec  1 04:52:34 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:52:34 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:52:34 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:52:34.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:52:35 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:52:35.030+0000 7f7998628140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Dec  1 04:52:35 np0005540826 ceph-mgr[80334]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Dec  1 04:52:35 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'k8sevents'
Dec  1 04:52:35 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'localpool'
Dec  1 04:52:35 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'mds_autoscaler'
Dec  1 04:52:35 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:52:35 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e40023f0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:35 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'mirroring'
Dec  1 04:52:35 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:52:35 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.002000025s ======
Dec  1 04:52:35 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:52:35.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000025s
Dec  1 04:52:35 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'nfs'
Dec  1 04:52:36 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:52:36.080+0000 7f7998628140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Dec  1 04:52:36 np0005540826 ceph-mgr[80334]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Dec  1 04:52:36 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'orchestrator'
Dec  1 04:52:36 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:52:36.318+0000 7f7998628140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Dec  1 04:52:36 np0005540826 ceph-mgr[80334]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Dec  1 04:52:36 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'osd_perf_query'
Dec  1 04:52:36 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:52:36 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0f4009ec0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:36 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:52:36.395+0000 7f7998628140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Dec  1 04:52:36 np0005540826 ceph-mgr[80334]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Dec  1 04:52:36 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'osd_support'
Dec  1 04:52:36 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:52:36.471+0000 7f7998628140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Dec  1 04:52:36 np0005540826 ceph-mgr[80334]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Dec  1 04:52:36 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'pg_autoscaler'
Dec  1 04:52:36 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:52:36.560+0000 7f7998628140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Dec  1 04:52:36 np0005540826 ceph-mgr[80334]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Dec  1 04:52:36 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'progress'
Dec  1 04:52:36 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:52:36.630+0000 7f7998628140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Dec  1 04:52:36 np0005540826 ceph-mgr[80334]: mgr[py] Module progress has missing NOTIFY_TYPES member
Dec  1 04:52:36 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'prometheus'
Dec  1 04:52:36 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:52:36 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d8004050 fd 14 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:36 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:52:36 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000012s ======
Dec  1 04:52:36 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:52:36.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000012s
Dec  1 04:52:36 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:52:36.997+0000 7f7998628140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Dec  1 04:52:36 np0005540826 ceph-mgr[80334]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Dec  1 04:52:36 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'rbd_support'
Dec  1 04:52:37 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:52:37.109+0000 7f7998628140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Dec  1 04:52:37 np0005540826 ceph-mgr[80334]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Dec  1 04:52:37 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'restful'
Dec  1 04:52:37 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'rgw'
Dec  1 04:52:37 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:52:37.571+0000 7f7998628140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Dec  1 04:52:37 np0005540826 ceph-mgr[80334]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Dec  1 04:52:37 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'rook'
Dec  1 04:52:37 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:52:37 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c0002cb0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:37 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:52:37 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000012s ======
Dec  1 04:52:37 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:52:37.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000012s
Dec  1 04:52:38 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:52:38.173+0000 7f7998628140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Dec  1 04:52:38 np0005540826 ceph-mgr[80334]: mgr[py] Module rook has missing NOTIFY_TYPES member
Dec  1 04:52:38 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'selftest'
Dec  1 04:52:38 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:52:38.247+0000 7f7998628140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Dec  1 04:52:38 np0005540826 ceph-mgr[80334]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Dec  1 04:52:38 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'snap_schedule'
Dec  1 04:52:38 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:52:38.333+0000 7f7998628140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Dec  1 04:52:38 np0005540826 ceph-mgr[80334]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Dec  1 04:52:38 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'stats'
Dec  1 04:52:38 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:52:38 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e40023f0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:38 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'status'
Dec  1 04:52:38 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:52:38.489+0000 7f7998628140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Dec  1 04:52:38 np0005540826 ceph-mgr[80334]: mgr[py] Module status has missing NOTIFY_TYPES member
Dec  1 04:52:38 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'telegraf'
Dec  1 04:52:38 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:52:38.566+0000 7f7998628140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Dec  1 04:52:38 np0005540826 ceph-mgr[80334]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Dec  1 04:52:38 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'telemetry'
Dec  1 04:52:38 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:52:38.727+0000 7f7998628140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Dec  1 04:52:38 np0005540826 ceph-mgr[80334]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Dec  1 04:52:38 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'test_orchestrator'
Dec  1 04:52:38 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:52:38 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0f4009ec0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:38 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:52:38.954+0000 7f7998628140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Dec  1 04:52:38 np0005540826 ceph-mgr[80334]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Dec  1 04:52:38 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'volumes'
Dec  1 04:52:38 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:52:38 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:52:38 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:52:38.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:52:39 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e98 e98: 3 total, 3 up, 3 in
Dec  1 04:52:39 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:52:39.235+0000 7f7998628140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Dec  1 04:52:39 np0005540826 ceph-mgr[80334]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Dec  1 04:52:39 np0005540826 ceph-mgr[80334]: mgr[py] Loading python module 'zabbix'
Dec  1 04:52:39 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 2025-12-01T09:52:39.307+0000 7f7998628140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Dec  1 04:52:39 np0005540826 ceph-mgr[80334]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Dec  1 04:52:39 np0005540826 ceph-mgr[80334]: [dashboard DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec  1 04:52:39 np0005540826 ceph-mgr[80334]: mgr load Constructed class from module: dashboard
Dec  1 04:52:39 np0005540826 ceph-mgr[80334]: [prometheus DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec  1 04:52:39 np0005540826 ceph-mgr[80334]: mgr load Constructed class from module: prometheus
Dec  1 04:52:39 np0005540826 ceph-mgr[80334]: ms_deliver_dispatch: unhandled message 0x55ce7fb8d860 mon_map magic: 0 from mon.2 v2:192.168.122.101:3300/0
Dec  1 04:52:39 np0005540826 ceph-mgr[80334]: [dashboard INFO root] server: ssl=no host=192.168.122.101 port=8443
Dec  1 04:52:39 np0005540826 ceph-mgr[80334]: [dashboard INFO root] Configured CherryPy, starting engine...
Dec  1 04:52:39 np0005540826 ceph-mgr[80334]: [dashboard INFO root] Starting engine...
Dec  1 04:52:39 np0005540826 ceph-mgr[80334]: [prometheus INFO root] server_addr: :: server_port: 9283
Dec  1 04:52:39 np0005540826 ceph-mgr[80334]: [prometheus INFO root] Starting engine...
Dec  1 04:52:39 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: [01/Dec/2025:09:52:39] ENGINE Bus STARTING
Dec  1 04:52:39 np0005540826 ceph-mgr[80334]: [prometheus INFO cherrypy.error] [01/Dec/2025:09:52:39] ENGINE Bus STARTING
Dec  1 04:52:39 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: CherryPy Checker:
Dec  1 04:52:39 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: The Application mounted at '' has an empty config.
Dec  1 04:52:39 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: 
Dec  1 04:52:39 np0005540826 ceph-mon[80026]: Active manager daemon compute-0.fospow restarted
Dec  1 04:52:39 np0005540826 ceph-mon[80026]: Activating manager daemon compute-0.fospow
Dec  1 04:52:39 np0005540826 ceph-mgr[80334]: [dashboard INFO root] Engine started...
Dec  1 04:52:39 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: [01/Dec/2025:09:52:39] ENGINE Serving on http://:::9283
Dec  1 04:52:39 np0005540826 ceph-mgr[80334]: [prometheus INFO cherrypy.error] [01/Dec/2025:09:52:39] ENGINE Serving on http://:::9283
Dec  1 04:52:39 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-1-ymizfm[80330]: [01/Dec/2025:09:52:39] ENGINE Bus STARTED
Dec  1 04:52:39 np0005540826 ceph-mgr[80334]: [prometheus INFO cherrypy.error] [01/Dec/2025:09:52:39] ENGINE Bus STARTED
Dec  1 04:52:39 np0005540826 ceph-mgr[80334]: [prometheus INFO root] Engine started.
Dec  1 04:52:39 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:52:39 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d8004050 fd 14 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:39 np0005540826 systemd-logind[787]: New session 37 of user ceph-admin.
Dec  1 04:52:39 np0005540826 systemd[1]: Started Session 37 of User ceph-admin.
Dec  1 04:52:39 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:52:39 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000012s ======
Dec  1 04:52:39 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:52:39.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000012s
Dec  1 04:52:39 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e98 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:52:40 np0005540826 ceph-mon[80026]: Manager daemon compute-0.fospow is now available
Dec  1 04:52:40 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:52:40 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:52:40 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.fospow/mirror_snapshot_schedule"}]: dispatch
Dec  1 04:52:40 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.fospow/trash_purge_schedule"}]: dispatch
Dec  1 04:52:40 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:52:40 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c0003db0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:40 np0005540826 podman[86386]: 2025-12-01 09:52:40.604462631 +0000 UTC m=+0.088349646 container exec 7c618b1a07db29948a07bf15abaac0e58a16d635b553771e228aad7ce3c0be89 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-crash-compute-1, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:52:40 np0005540826 podman[86386]: 2025-12-01 09:52:40.726450115 +0000 UTC m=+0.210337140 container exec_died 7c618b1a07db29948a07bf15abaac0e58a16d635b553771e228aad7ce3c0be89 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-crash-compute-1, OSD_FLAVOR=default, org.label-schema.build-date=20250325, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True)
Dec  1 04:52:40 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:52:40 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e40023f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:40 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:52:40 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:52:40 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:52:40.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:52:41 np0005540826 podman[86506]: 2025-12-01 09:52:41.231603642 +0000 UTC m=+0.060217654 container exec b5cba524f9aa4c6a3dd34b5465b21a05089c422db1ad8a4bb4113778960afe0c (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec  1 04:52:41 np0005540826 podman[86506]: 2025-12-01 09:52:41.266636718 +0000 UTC m=+0.095250710 container exec_died b5cba524f9aa4c6a3dd34b5465b21a05089c422db1ad8a4bb4113778960afe0c (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec  1 04:52:41 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "14"}]: dispatch
Dec  1 04:52:41 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e99 e99: 3 total, 3 up, 3 in
Dec  1 04:52:41 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 99 pg[10.d( v 56'1015 (0'0,56'1015] local-lis/les=78/79 n=8 ec=61/50 lis/c=78/78 les/c/f=79/79/0 sis=99 pruub=15.431180000s) [1] r=-1 lpr=99 pi=[78,99)/1 crt=56'1015 mlcod 0'0 active pruub 270.310150146s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:52:41 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 99 pg[10.d( v 56'1015 (0'0,56'1015] local-lis/les=78/79 n=8 ec=61/50 lis/c=78/78 les/c/f=79/79/0 sis=99 pruub=15.431142807s) [1] r=-1 lpr=99 pi=[78,99)/1 crt=56'1015 mlcod 0'0 unknown NOTIFY pruub 270.310150146s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:52:41 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 99 pg[10.1d( v 56'1015 (0'0,56'1015] local-lis/les=78/79 n=5 ec=61/50 lis/c=78/78 les/c/f=79/79/0 sis=99 pruub=15.430302620s) [1] r=-1 lpr=99 pi=[78,99)/1 crt=56'1015 mlcod 0'0 active pruub 270.310089111s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:52:41 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 99 pg[10.1d( v 56'1015 (0'0,56'1015] local-lis/les=78/79 n=5 ec=61/50 lis/c=78/78 les/c/f=79/79/0 sis=99 pruub=15.430269241s) [1] r=-1 lpr=99 pi=[78,99)/1 crt=56'1015 mlcod 0'0 unknown NOTIFY pruub 270.310089111s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:52:41 np0005540826 podman[86598]: 2025-12-01 09:52:41.628251408 +0000 UTC m=+0.058318820 container exec 470db443dfbb44b498655286f13305e4bbfd9663e0774e5d00795d81ba1e8b0e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_REF=squid, ceph=True)
Dec  1 04:52:41 np0005540826 podman[86598]: 2025-12-01 09:52:41.639418554 +0000 UTC m=+0.069485946 container exec_died 470db443dfbb44b498655286f13305e4bbfd9663e0774e5d00795d81ba1e8b0e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.license=GPLv2, ceph=True, org.label-schema.vendor=CentOS)
Dec  1 04:52:41 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:52:41 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0f4009ec0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:41 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:52:41 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.002000024s ======
Dec  1 04:52:41 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:52:41.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000024s
Dec  1 04:52:41 np0005540826 podman[86663]: 2025-12-01 09:52:41.859715985 +0000 UTC m=+0.070244976 container exec 5d28e05f02e2faaaecccbb224265ce968eb74994512ada8525a0c42a70019a9e (image=quay.io/ceph/haproxy:2.3, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis)
Dec  1 04:52:41 np0005540826 podman[86663]: 2025-12-01 09:52:41.876387528 +0000 UTC m=+0.086916479 container exec_died 5d28e05f02e2faaaecccbb224265ce968eb74994512ada8525a0c42a70019a9e (image=quay.io/ceph/haproxy:2.3, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis)
Dec  1 04:52:42 np0005540826 podman[86732]: 2025-12-01 09:52:42.085662004 +0000 UTC m=+0.054600115 container exec b99b1792fbad48bc30fedd676b925d405af73aba1041bee68b375eadfb2319cf (image=quay.io/ceph/keepalived:2.2.4, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-1-wzwqmm, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.tags=Ceph keepalived, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2023-02-22T09:23:20, summary=Provides keepalived on RHEL 9 for Ceph., vcs-type=git, io.buildah.version=1.28.2, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1793, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=keepalived for Ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, com.redhat.component=keepalived-container, name=keepalived, architecture=x86_64, io.openshift.expose-services=, vendor=Red Hat, Inc., distribution-scope=public, version=2.2.4, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9)
Dec  1 04:52:42 np0005540826 podman[86732]: 2025-12-01 09:52:42.127472483 +0000 UTC m=+0.096410624 container exec_died b99b1792fbad48bc30fedd676b925d405af73aba1041bee68b375eadfb2319cf (image=quay.io/ceph/keepalived:2.2.4, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-1-wzwqmm, io.openshift.tags=Ceph keepalived, io.buildah.version=1.28.2, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=keepalived for Ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, summary=Provides keepalived on RHEL 9 for Ceph., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, build-date=2023-02-22T09:23:20, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=keepalived, release=1793, architecture=x86_64, io.openshift.expose-services=, distribution-scope=public, io.k8s.display-name=Keepalived on RHEL 9, version=2.2.4, vcs-type=git, com.redhat.component=keepalived-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc.)
Dec  1 04:52:42 np0005540826 systemd-logind[787]: New session 38 of user zuul.
Dec  1 04:52:42 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:52:42 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d8004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:42 np0005540826 systemd[1]: Started Session 38 of User zuul.
Dec  1 04:52:42 np0005540826 ceph-mon[80026]: [01/Dec/2025:09:52:40] ENGINE Bus STARTING
Dec  1 04:52:42 np0005540826 ceph-mon[80026]: [01/Dec/2025:09:52:40] ENGINE Serving on http://192.168.122.100:8765
Dec  1 04:52:42 np0005540826 ceph-mon[80026]: [01/Dec/2025:09:52:40] ENGINE Serving on https://192.168.122.100:7150
Dec  1 04:52:42 np0005540826 ceph-mon[80026]: [01/Dec/2025:09:52:40] ENGINE Bus STARTED
Dec  1 04:52:42 np0005540826 ceph-mon[80026]: [01/Dec/2025:09:52:40] ENGINE Client ('192.168.122.100', 52120) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Dec  1 04:52:42 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "14"}]': finished
Dec  1 04:52:42 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:52:42 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:52:42 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c0003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:42 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:52:42 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:52:42 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:52:42.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:52:43 np0005540826 python3.9[86988]: ansible-ansible.legacy.ping Invoked with data=pong
Dec  1 04:52:43 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e100 e100: 3 total, 3 up, 3 in
Dec  1 04:52:43 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 100 pg[10.d( v 56'1015 (0'0,56'1015] local-lis/les=78/79 n=8 ec=61/50 lis/c=78/78 les/c/f=79/79/0 sis=100) [1]/[0] r=0 lpr=100 pi=[78,100)/1 crt=56'1015 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:52:43 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 100 pg[10.d( v 56'1015 (0'0,56'1015] local-lis/les=78/79 n=8 ec=61/50 lis/c=78/78 les/c/f=79/79/0 sis=100) [1]/[0] r=0 lpr=100 pi=[78,100)/1 crt=56'1015 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Dec  1 04:52:43 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 100 pg[10.1d( v 56'1015 (0'0,56'1015] local-lis/les=78/79 n=5 ec=61/50 lis/c=78/78 les/c/f=79/79/0 sis=100) [1]/[0] r=0 lpr=100 pi=[78,100)/1 crt=56'1015 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:52:43 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 100 pg[10.1d( v 56'1015 (0'0,56'1015] local-lis/les=78/79 n=5 ec=61/50 lis/c=78/78 les/c/f=79/79/0 sis=100) [1]/[0] r=0 lpr=100 pi=[78,100)/1 crt=56'1015 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Dec  1 04:52:43 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:52:43 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e40023f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:43 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:52:43 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000012s ======
Dec  1 04:52:43 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:52:43.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000012s
Dec  1 04:52:43 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:52:43 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "15"}]: dispatch
Dec  1 04:52:43 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:52:43 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:52:43 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:52:43 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:52:43 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Dec  1 04:52:43 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd='[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]': finished
Dec  1 04:52:44 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:52:44 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0f4009ec0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:44 np0005540826 python3.9[87243]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:52:44 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e101 e101: 3 total, 3 up, 3 in
Dec  1 04:52:44 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e101 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:52:44 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:52:44 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d8004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:44 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:52:44 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:52:44 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:52:44.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:52:45 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 101 pg[10.d( v 56'1015 (0'0,56'1015] local-lis/les=100/101 n=8 ec=61/50 lis/c=78/78 les/c/f=79/79/0 sis=100) [1]/[0] async=[1] r=0 lpr=100 pi=[78,100)/1 crt=56'1015 mlcod 0'0 active+remapped mbc={255={(0+1)=8}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:52:45 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 101 pg[10.1d( v 56'1015 (0'0,56'1015] local-lis/les=100/101 n=5 ec=61/50 lis/c=78/78 les/c/f=79/79/0 sis=100) [1]/[0] async=[1] r=0 lpr=100 pi=[78,100)/1 crt=56'1015 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:52:45 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:52:45 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c0003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:45 np0005540826 python3.9[87399]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:52:45 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "15"}]': finished
Dec  1 04:52:45 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:52:45 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:52:45 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:52:45 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:52:45 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Dec  1 04:52:45 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "16"}]: dispatch
Dec  1 04:52:45 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e102 e102: 3 total, 3 up, 3 in
Dec  1 04:52:45 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 102 pg[10.d( v 56'1015 (0'0,56'1015] local-lis/les=100/101 n=8 ec=61/50 lis/c=100/78 les/c/f=101/79/0 sis=102 pruub=15.292604446s) [1] async=[1] r=-1 lpr=102 pi=[78,102)/1 crt=56'1015 mlcod 56'1015 active pruub 274.513580322s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:52:45 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 102 pg[10.d( v 56'1015 (0'0,56'1015] local-lis/les=100/101 n=8 ec=61/50 lis/c=100/78 les/c/f=101/79/0 sis=102 pruub=15.292541504s) [1] r=-1 lpr=102 pi=[78,102)/1 crt=56'1015 mlcod 0'0 unknown NOTIFY pruub 274.513580322s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:52:45 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 102 pg[10.1d( v 56'1015 (0'0,56'1015] local-lis/les=100/101 n=5 ec=61/50 lis/c=100/78 les/c/f=101/79/0 sis=102 pruub=15.291830063s) [1] async=[1] r=-1 lpr=102 pi=[78,102)/1 crt=56'1015 mlcod 56'1015 active pruub 274.513610840s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:52:45 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 102 pg[10.1d( v 56'1015 (0'0,56'1015] local-lis/les=100/101 n=5 ec=61/50 lis/c=100/78 les/c/f=101/79/0 sis=102 pruub=15.291762352s) [1] r=-1 lpr=102 pi=[78,102)/1 crt=56'1015 mlcod 0'0 unknown NOTIFY pruub 274.513610840s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:52:45 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:52:45 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:52:45 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:52:45.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:52:46 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:52:46 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e40023f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:46 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "16"}]': finished
Dec  1 04:52:46 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:52:46 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:52:46 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Dec  1 04:52:46 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 04:52:46 np0005540826 ceph-mon[80026]: Updating compute-0:/etc/ceph/ceph.conf
Dec  1 04:52:46 np0005540826 ceph-mon[80026]: Updating compute-1:/etc/ceph/ceph.conf
Dec  1 04:52:46 np0005540826 ceph-mon[80026]: Updating compute-2:/etc/ceph/ceph.conf
Dec  1 04:52:46 np0005540826 ceph-mon[80026]: Updating compute-0:/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.conf
Dec  1 04:52:46 np0005540826 ceph-mon[80026]: Updating compute-2:/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.conf
Dec  1 04:52:46 np0005540826 ceph-mon[80026]: Updating compute-1:/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.conf
Dec  1 04:52:46 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e103 e103: 3 total, 3 up, 3 in
Dec  1 04:52:46 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:52:46 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0f4009ec0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:46 np0005540826 python3.9[87887]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:52:46 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:52:46 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000013s ======
Dec  1 04:52:46 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:52:46.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000013s
Dec  1 04:52:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:52:47 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d8004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:47 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:52:47 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:52:47 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:52:47.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:52:47 np0005540826 python3.9[88401]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:52:48 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:52:48 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c0003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:48 np0005540826 python3.9[88752]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:52:48 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:52:48 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e40023f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:48 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:52:48 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:52:48 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:52:48.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:52:49 np0005540826 python3.9[88902]: ansible-ansible.builtin.service_facts Invoked
Dec  1 04:52:49 np0005540826 network[88919]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec  1 04:52:49 np0005540826 network[88920]: 'network-scripts' will be removed from distribution in near future.
Dec  1 04:52:49 np0005540826 network[88921]: It is advised to switch to 'NetworkManager' instead for network management.
Dec  1 04:52:49 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:52:49 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0f4009ec0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:49 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:52:49 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000012s ======
Dec  1 04:52:49 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:52:49.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000012s
Dec  1 04:52:49 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e103 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:52:50 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e104 e104: 3 total, 3 up, 3 in
Dec  1 04:52:50 np0005540826 ceph-mon[80026]: Updating compute-0:/etc/ceph/ceph.client.admin.keyring
Dec  1 04:52:50 np0005540826 ceph-mon[80026]: Updating compute-2:/etc/ceph/ceph.client.admin.keyring
Dec  1 04:52:50 np0005540826 ceph-mon[80026]: Updating compute-1:/etc/ceph/ceph.client.admin.keyring
Dec  1 04:52:50 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]: dispatch
Dec  1 04:52:50 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:52:50 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d8004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:50 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:52:50 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0001090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:50 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:52:50 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:52:50 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:52:50.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:52:51 np0005540826 ceph-mon[80026]: Updating compute-0:/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.client.admin.keyring
Dec  1 04:52:51 np0005540826 ceph-mon[80026]: Updating compute-2:/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.client.admin.keyring
Dec  1 04:52:51 np0005540826 ceph-mon[80026]: Updating compute-1:/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.client.admin.keyring
Dec  1 04:52:51 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]: dispatch
Dec  1 04:52:51 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]': finished
Dec  1 04:52:51 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:52:51 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:52:51 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:52:51 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:52:51 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:52:51 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:52:51 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:52:51 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:52:51 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 04:52:51 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e105 e105: 3 total, 3 up, 3 in
Dec  1 04:52:51 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:52:51 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0cc002690 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:51 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:52:51 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:52:51 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:52:51.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:52:52 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]': finished
Dec  1 04:52:52 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e106 e106: 3 total, 3 up, 3 in
Dec  1 04:52:52 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:52:52 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d00029f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:52 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:52:52 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0f4009ec0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:52 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:52:52 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:52:52 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:52:52.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:52:53 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e107 e107: 3 total, 3 up, 3 in
Dec  1 04:52:53 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:52:53 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d8004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:53 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:52:53 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:52:53 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:52:53.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:52:54 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:52:54 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0cc002690 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:54 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e108 e108: 3 total, 3 up, 3 in
Dec  1 04:52:54 np0005540826 python3.9[89186]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:52:54 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e108 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:52:54 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:52:54 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d00029f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:54 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:52:54 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:52:54 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:52:54.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:52:55 np0005540826 python3.9[89336]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:52:55 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:52:55 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0f4009ec0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:55 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:52:55 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:52:55 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:52:55.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:52:56 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:52:56 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d8004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:56 np0005540826 python3.9[89541]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:52:56 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:52:56 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:52:56 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:52:56 np0005540826 ceph-mon[80026]: Reconfiguring mon.compute-0 (monmap changed)...
Dec  1 04:52:56 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Dec  1 04:52:56 np0005540826 ceph-mon[80026]: Reconfiguring daemon mon.compute-0 on compute-0
Dec  1 04:52:56 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:52:56 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0cc002690 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:56 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:52:56 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:52:56 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:52:56.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:52:57 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:52:57 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d00029f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:57 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:52:57 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:52:57 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:52:57.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:52:57 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:52:57 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:52:57 np0005540826 ceph-mon[80026]: Reconfiguring mgr.compute-0.fospow (monmap changed)...
Dec  1 04:52:57 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-0.fospow", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Dec  1 04:52:57 np0005540826 ceph-mon[80026]: Reconfiguring daemon mgr.compute-0.fospow on compute-0
Dec  1 04:52:57 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "18"}]: dispatch
Dec  1 04:52:57 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:52:57 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:52:57 np0005540826 ceph-mon[80026]: Reconfiguring crash.compute-0 (monmap changed)...
Dec  1 04:52:57 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-0", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Dec  1 04:52:57 np0005540826 ceph-mon[80026]: Reconfiguring daemon crash.compute-0 on compute-0
Dec  1 04:52:57 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e109 e109: 3 total, 3 up, 3 in
Dec  1 04:52:57 np0005540826 python3.9[89699]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  1 04:52:58 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:52:58 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0f4009ec0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:58 np0005540826 python3.9[89784]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  1 04:52:58 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "18"}]': finished
Dec  1 04:52:58 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:52:58 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:52:58 np0005540826 ceph-mon[80026]: Reconfiguring osd.1 (monmap changed)...
Dec  1 04:52:58 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch
Dec  1 04:52:58 np0005540826 ceph-mon[80026]: Reconfiguring daemon osd.1 on compute-0
Dec  1 04:52:58 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:52:58 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d8004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:59 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:52:59 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:52:59 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:52:59.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:52:59 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:52:59 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0cc002690 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:59 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e109 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:52:59 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:52:59 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:52:59 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:52:59.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:53:00 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:53:00 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d00029f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:00 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:53:00 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:53:00 np0005540826 ceph-mon[80026]: Reconfiguring node-exporter.compute-0 (unknown last config time)...
Dec  1 04:53:00 np0005540826 ceph-mon[80026]: Reconfiguring daemon node-exporter.compute-0 on compute-0
Dec  1 04:53:00 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "19"}]: dispatch
Dec  1 04:53:00 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e110 e110: 3 total, 3 up, 3 in
Dec  1 04:53:00 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 110 pg[10.12( v 56'1015 (0'0,56'1015] local-lis/les=71/72 n=4 ec=61/50 lis/c=71/71 les/c/f=72/72/0 sis=110 pruub=14.376198769s) [2] r=-1 lpr=110 pi=[71,110)/1 crt=56'1015 mlcod 0'0 active pruub 288.559417725s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:53:00 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 110 pg[10.12( v 56'1015 (0'0,56'1015] local-lis/les=71/72 n=4 ec=61/50 lis/c=71/71 les/c/f=72/72/0 sis=110 pruub=14.376149178s) [2] r=-1 lpr=110 pi=[71,110)/1 crt=56'1015 mlcod 0'0 unknown NOTIFY pruub 288.559417725s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:53:00 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:53:00 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0f4009ec0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:01 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:53:01 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:53:01 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:53:01.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:53:01 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:53:01 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d8004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:01 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "19"}]': finished
Dec  1 04:53:01 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:53:01 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "20"}]: dispatch
Dec  1 04:53:01 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:53:01 np0005540826 ceph-mon[80026]: Reconfiguring alertmanager.compute-0 (dependencies changed)...
Dec  1 04:53:01 np0005540826 ceph-mon[80026]: Reconfiguring daemon alertmanager.compute-0 on compute-0
Dec  1 04:53:01 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:53:01 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:53:01 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:53:01.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:53:02 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e111 e111: 3 total, 3 up, 3 in
Dec  1 04:53:02 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 111 pg[10.13( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=69/69 les/c/f=70/70/0 sis=111) [0] r=0 lpr=111 pi=[69,111)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:53:02 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 111 pg[10.12( v 56'1015 (0'0,56'1015] local-lis/les=71/72 n=4 ec=61/50 lis/c=71/71 les/c/f=72/72/0 sis=111) [2]/[0] r=0 lpr=111 pi=[71,111)/1 crt=56'1015 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:53:02 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 111 pg[10.12( v 56'1015 (0'0,56'1015] local-lis/les=71/72 n=4 ec=61/50 lis/c=71/71 les/c/f=72/72/0 sis=111) [2]/[0] r=0 lpr=111 pi=[71,111)/1 crt=56'1015 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Dec  1 04:53:02 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:53:02 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0cc002690 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:02 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:53:02 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0003430 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:03 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:53:03 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 04:53:03 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:53:03.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 04:53:03 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e112 e112: 3 total, 3 up, 3 in
Dec  1 04:53:03 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 112 pg[10.13( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=69/69 les/c/f=70/70/0 sis=112) [0]/[2] r=-1 lpr=112 pi=[69,112)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:53:03 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 112 pg[10.13( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=69/69 les/c/f=70/70/0 sis=112) [0]/[2] r=-1 lpr=112 pi=[69,112)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec  1 04:53:03 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "20"}]': finished
Dec  1 04:53:03 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 112 pg[10.12( v 56'1015 (0'0,56'1015] local-lis/les=111/112 n=4 ec=61/50 lis/c=71/71 les/c/f=72/72/0 sis=111) [2]/[0] async=[2] r=0 lpr=111 pi=[71,111)/1 crt=56'1015 mlcod 0'0 active+remapped mbc={255={(0+1)=4}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:53:03 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:53:03 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0f4009ec0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:03 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:53:03 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 04:53:03 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:53:03.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 04:53:04 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e113 e113: 3 total, 3 up, 3 in
Dec  1 04:53:04 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 113 pg[10.14( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=77/77 les/c/f=78/78/0 sis=113) [0] r=0 lpr=113 pi=[77,113)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:53:04 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 113 pg[10.12( v 56'1015 (0'0,56'1015] local-lis/les=111/112 n=4 ec=61/50 lis/c=111/71 les/c/f=112/72/0 sis=113 pruub=14.794373512s) [2] async=[2] r=-1 lpr=113 pi=[71,113)/1 crt=56'1015 mlcod 56'1015 active pruub 292.453704834s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:53:04 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 113 pg[10.12( v 56'1015 (0'0,56'1015] local-lis/les=111/112 n=4 ec=61/50 lis/c=111/71 les/c/f=112/72/0 sis=113 pruub=14.794007301s) [2] r=-1 lpr=113 pi=[71,113)/1 crt=56'1015 mlcod 0'0 unknown NOTIFY pruub 292.453704834s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:53:04 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "21"}]: dispatch
Dec  1 04:53:04 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:53:04 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d8004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:04 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e113 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:53:04 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:53:04 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0cc002690 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:05 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:53:05 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:53:05 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:53:05.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:53:05 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e114 e114: 3 total, 3 up, 3 in
Dec  1 04:53:05 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 114 pg[10.13( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=5 ec=61/50 lis/c=112/69 les/c/f=113/70/0 sis=114) [0] r=0 lpr=114 pi=[69,114)/1 luod=0'0 crt=56'1015 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:53:05 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 114 pg[10.14( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=77/77 les/c/f=78/78/0 sis=114) [0]/[2] r=-1 lpr=114 pi=[77,114)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:53:05 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 114 pg[10.13( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=5 ec=61/50 lis/c=112/69 les/c/f=113/70/0 sis=114) [0] r=0 lpr=114 pi=[69,114)/1 crt=56'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:53:05 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 114 pg[10.14( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=77/77 les/c/f=78/78/0 sis=114) [0]/[2] r=-1 lpr=114 pi=[77,114)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec  1 04:53:05 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "21"}]': finished
Dec  1 04:53:05 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "22"}]: dispatch
Dec  1 04:53:05 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:53:05 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0003430 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:05 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:53:05 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 04:53:05 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:53:05.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 04:53:06 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:53:06 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0f4009ec0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:06 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e115 e115: 3 total, 3 up, 3 in
Dec  1 04:53:06 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 115 pg[10.13( v 56'1015 (0'0,56'1015] local-lis/les=114/115 n=5 ec=61/50 lis/c=112/69 les/c/f=113/70/0 sis=114) [0] r=0 lpr=114 pi=[69,114)/1 crt=56'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:53:06 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:53:06 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:53:06 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "22"}]': finished
Dec  1 04:53:06 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:53:06 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d8004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:07 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:53:07 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:53:07 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:53:07.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:53:07 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:53:07 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0f4009ec0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:07 np0005540826 ceph-mon[80026]: Reconfiguring grafana.compute-0 (dependencies changed)...
Dec  1 04:53:07 np0005540826 ceph-mon[80026]: Reconfiguring daemon grafana.compute-0 on compute-0
Dec  1 04:53:07 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e116 e116: 3 total, 3 up, 3 in
Dec  1 04:53:07 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 116 pg[10.14( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=5 ec=61/50 lis/c=114/77 les/c/f=115/78/0 sis=116) [0] r=0 lpr=116 pi=[77,116)/1 luod=0'0 crt=56'1015 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:53:07 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 116 pg[10.14( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=5 ec=61/50 lis/c=114/77 les/c/f=115/78/0 sis=116) [0] r=0 lpr=116 pi=[77,116)/1 crt=56'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:53:07 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:53:07 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 04:53:07 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:53:07.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 04:53:08 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:53:08 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0cc003a20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:08 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:53:08 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0003430 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:08 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e117 e117: 3 total, 3 up, 3 in
Dec  1 04:53:09 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 117 pg[10.14( v 56'1015 (0'0,56'1015] local-lis/les=116/117 n=5 ec=61/50 lis/c=114/77 les/c/f=115/78/0 sis=116) [0] r=0 lpr=116 pi=[77,116)/1 crt=56'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:53:09 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:53:09 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 04:53:09 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:53:09.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 04:53:09 np0005540826 podman[89924]: 2025-12-01 09:53:09.5245266 +0000 UTC m=+0.077889598 container create df09a310a9aad4ac75f3d7e279aa84a41093423b48846040ed67b6bb77befe91 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=hardcore_mcclintock, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:53:09 np0005540826 systemd[81783]: Starting Mark boot as successful...
Dec  1 04:53:09 np0005540826 systemd[1]: Started libpod-conmon-df09a310a9aad4ac75f3d7e279aa84a41093423b48846040ed67b6bb77befe91.scope.
Dec  1 04:53:09 np0005540826 systemd[81783]: Finished Mark boot as successful.
Dec  1 04:53:09 np0005540826 podman[89924]: 2025-12-01 09:53:09.469233321 +0000 UTC m=+0.022596309 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 04:53:09 np0005540826 systemd[1]: Started libcrun container.
Dec  1 04:53:09 np0005540826 podman[89924]: 2025-12-01 09:53:09.617919557 +0000 UTC m=+0.171282545 container init df09a310a9aad4ac75f3d7e279aa84a41093423b48846040ed67b6bb77befe91 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=hardcore_mcclintock, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:53:09 np0005540826 podman[89924]: 2025-12-01 09:53:09.627949839 +0000 UTC m=+0.181312797 container start df09a310a9aad4ac75f3d7e279aa84a41093423b48846040ed67b6bb77befe91 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=hardcore_mcclintock, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS)
Dec  1 04:53:09 np0005540826 podman[89924]: 2025-12-01 09:53:09.632466882 +0000 UTC m=+0.185829860 container attach df09a310a9aad4ac75f3d7e279aa84a41093423b48846040ed67b6bb77befe91 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=hardcore_mcclintock, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:53:09 np0005540826 systemd[1]: libpod-df09a310a9aad4ac75f3d7e279aa84a41093423b48846040ed67b6bb77befe91.scope: Deactivated successfully.
Dec  1 04:53:09 np0005540826 hardcore_mcclintock[89941]: 167 167
Dec  1 04:53:09 np0005540826 podman[89924]: 2025-12-01 09:53:09.637844847 +0000 UTC m=+0.191207805 container died df09a310a9aad4ac75f3d7e279aa84a41093423b48846040ed67b6bb77befe91 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=hardcore_mcclintock, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:53:09 np0005540826 conmon[89941]: conmon df09a310a9aad4ac75f3 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-df09a310a9aad4ac75f3d7e279aa84a41093423b48846040ed67b6bb77befe91.scope/container/memory.events
Dec  1 04:53:09 np0005540826 systemd[1]: var-lib-containers-storage-overlay-6f38380b04ef39d8dd9df8dd967af46291411fbca4ee3db9ab8f16d8a8a8d94b-merged.mount: Deactivated successfully.
Dec  1 04:53:09 np0005540826 podman[89924]: 2025-12-01 09:53:09.674821446 +0000 UTC m=+0.228184404 container remove df09a310a9aad4ac75f3d7e279aa84a41093423b48846040ed67b6bb77befe91 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=hardcore_mcclintock, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:53:09 np0005540826 systemd[1]: libpod-conmon-df09a310a9aad4ac75f3d7e279aa84a41093423b48846040ed67b6bb77befe91.scope: Deactivated successfully.
Dec  1 04:53:09 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:53:09 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d8004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:09 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e117 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:53:09 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:53:09 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 04:53:09 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:53:09.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 04:53:10 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:53:10 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:53:10 np0005540826 ceph-mon[80026]: Reconfiguring crash.compute-1 (monmap changed)...
Dec  1 04:53:10 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-1", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Dec  1 04:53:10 np0005540826 ceph-mon[80026]: Reconfiguring daemon crash.compute-1 on compute-1
Dec  1 04:53:10 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:53:10 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:53:10 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch
Dec  1 04:53:10 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:53:10 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0f400a7e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:10 np0005540826 podman[90027]: 2025-12-01 09:53:10.326147042 +0000 UTC m=+0.022584628 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 04:53:10 np0005540826 podman[90027]: 2025-12-01 09:53:10.492724068 +0000 UTC m=+0.189161634 container create 4b60cab5f22ab92b568b9171e33c6a8b3f09a3b669bb408edfc38772014a0ffa (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=festive_pascal, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325)
Dec  1 04:53:10 np0005540826 systemd[1]: Started libpod-conmon-4b60cab5f22ab92b568b9171e33c6a8b3f09a3b669bb408edfc38772014a0ffa.scope.
Dec  1 04:53:10 np0005540826 systemd[1]: Started libcrun container.
Dec  1 04:53:10 np0005540826 podman[90027]: 2025-12-01 09:53:10.858281863 +0000 UTC m=+0.554719459 container init 4b60cab5f22ab92b568b9171e33c6a8b3f09a3b669bb408edfc38772014a0ffa (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=festive_pascal, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2)
Dec  1 04:53:10 np0005540826 podman[90027]: 2025-12-01 09:53:10.86492813 +0000 UTC m=+0.561365696 container start 4b60cab5f22ab92b568b9171e33c6a8b3f09a3b669bb408edfc38772014a0ffa (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=festive_pascal, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, io.buildah.version=1.40.1)
Dec  1 04:53:10 np0005540826 festive_pascal[90043]: 167 167
Dec  1 04:53:10 np0005540826 systemd[1]: libpod-4b60cab5f22ab92b568b9171e33c6a8b3f09a3b669bb408edfc38772014a0ffa.scope: Deactivated successfully.
Dec  1 04:53:10 np0005540826 podman[90027]: 2025-12-01 09:53:10.883332883 +0000 UTC m=+0.579770469 container attach 4b60cab5f22ab92b568b9171e33c6a8b3f09a3b669bb408edfc38772014a0ffa (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=festive_pascal, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, ceph=True, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:53:10 np0005540826 podman[90027]: 2025-12-01 09:53:10.883994319 +0000 UTC m=+0.580431885 container died 4b60cab5f22ab92b568b9171e33c6a8b3f09a3b669bb408edfc38772014a0ffa (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=festive_pascal, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, OSD_FLAVOR=default)
Dec  1 04:53:10 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:53:10 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0cc003a20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:11 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:53:11 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 04:53:11 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:53:11.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 04:53:11 np0005540826 systemd[1]: var-lib-containers-storage-overlay-c435cb5c2ae0aa83ae9e4ccf7d09b805dcbbbf31e802e2d3072269bfc3767d9c-merged.mount: Deactivated successfully.
Dec  1 04:53:11 np0005540826 podman[90027]: 2025-12-01 09:53:11.211773055 +0000 UTC m=+0.908210621 container remove 4b60cab5f22ab92b568b9171e33c6a8b3f09a3b669bb408edfc38772014a0ffa (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=festive_pascal, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:53:11 np0005540826 systemd[1]: libpod-conmon-4b60cab5f22ab92b568b9171e33c6a8b3f09a3b669bb408edfc38772014a0ffa.scope: Deactivated successfully.
Dec  1 04:53:11 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e118 e118: 3 total, 3 up, 3 in
Dec  1 04:53:11 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:53:11 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0003430 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:11 np0005540826 ceph-mon[80026]: Reconfiguring osd.0 (monmap changed)...
Dec  1 04:53:11 np0005540826 ceph-mon[80026]: Reconfiguring daemon osd.0 on compute-1
Dec  1 04:53:11 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "23"}]: dispatch
Dec  1 04:53:11 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:53:11 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:53:11 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:53:11.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:53:12 np0005540826 podman[90138]: 2025-12-01 09:53:12.357921964 +0000 UTC m=+0.037556034 container create cad405dbc35f14f8af9fd41cb02034895ca066e4ef85836bd5184f56e3b8feaa (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=xenodochial_nightingale, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325)
Dec  1 04:53:12 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:53:12 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d8004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:12 np0005540826 systemd[1]: Started libpod-conmon-cad405dbc35f14f8af9fd41cb02034895ca066e4ef85836bd5184f56e3b8feaa.scope.
Dec  1 04:53:12 np0005540826 systemd[1]: Started libcrun container.
Dec  1 04:53:12 np0005540826 podman[90138]: 2025-12-01 09:53:12.431820931 +0000 UTC m=+0.111455021 container init cad405dbc35f14f8af9fd41cb02034895ca066e4ef85836bd5184f56e3b8feaa (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=xenodochial_nightingale, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec  1 04:53:12 np0005540826 podman[90138]: 2025-12-01 09:53:12.342503597 +0000 UTC m=+0.022137687 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 04:53:12 np0005540826 podman[90138]: 2025-12-01 09:53:12.43894662 +0000 UTC m=+0.118580690 container start cad405dbc35f14f8af9fd41cb02034895ca066e4ef85836bd5184f56e3b8feaa (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=xenodochial_nightingale, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.build-date=20250325)
Dec  1 04:53:12 np0005540826 podman[90138]: 2025-12-01 09:53:12.442299704 +0000 UTC m=+0.121933794 container attach cad405dbc35f14f8af9fd41cb02034895ca066e4ef85836bd5184f56e3b8feaa (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=xenodochial_nightingale, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, io.buildah.version=1.40.1, OSD_FLAVOR=default)
Dec  1 04:53:12 np0005540826 xenodochial_nightingale[90154]: 167 167
Dec  1 04:53:12 np0005540826 systemd[1]: libpod-cad405dbc35f14f8af9fd41cb02034895ca066e4ef85836bd5184f56e3b8feaa.scope: Deactivated successfully.
Dec  1 04:53:12 np0005540826 podman[90138]: 2025-12-01 09:53:12.4453045 +0000 UTC m=+0.124938580 container died cad405dbc35f14f8af9fd41cb02034895ca066e4ef85836bd5184f56e3b8feaa (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=xenodochial_nightingale, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:53:12 np0005540826 systemd[1]: var-lib-containers-storage-overlay-84c30a1925811e313e6c4be784b6cafffd6fba7fe821f82fc3a8eea05a8b9fc3-merged.mount: Deactivated successfully.
Dec  1 04:53:12 np0005540826 podman[90138]: 2025-12-01 09:53:12.476654028 +0000 UTC m=+0.156288098 container remove cad405dbc35f14f8af9fd41cb02034895ca066e4ef85836bd5184f56e3b8feaa (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=xenodochial_nightingale, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_REF=squid, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:53:12 np0005540826 systemd[1]: libpod-conmon-cad405dbc35f14f8af9fd41cb02034895ca066e4ef85836bd5184f56e3b8feaa.scope: Deactivated successfully.
Dec  1 04:53:12 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "23"}]': finished
Dec  1 04:53:12 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:53:12 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:53:12 np0005540826 ceph-mon[80026]: Reconfiguring mon.compute-1 (monmap changed)...
Dec  1 04:53:12 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Dec  1 04:53:12 np0005540826 ceph-mon[80026]: Reconfiguring daemon mon.compute-1 on compute-1
Dec  1 04:53:12 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:53:12 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:53:12 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Dec  1 04:53:12 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:53:12 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0f400a7e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:13 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:53:13 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:53:13 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:53:13.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:53:13 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:53:13 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0cc003a20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:13 np0005540826 ceph-mon[80026]: Reconfiguring mon.compute-2 (monmap changed)...
Dec  1 04:53:13 np0005540826 ceph-mon[80026]: Reconfiguring daemon mon.compute-2 on compute-2
Dec  1 04:53:13 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "24"}]: dispatch
Dec  1 04:53:13 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:53:13 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:53:13 np0005540826 ceph-mon[80026]: Reconfiguring mgr.compute-2.kdtkls (monmap changed)...
Dec  1 04:53:13 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-2.kdtkls", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Dec  1 04:53:13 np0005540826 ceph-mon[80026]: Reconfiguring daemon mgr.compute-2.kdtkls on compute-2
Dec  1 04:53:13 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:53:13 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:53:13 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:53:13.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:53:13 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e119 e119: 3 total, 3 up, 3 in
Dec  1 04:53:14 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:53:14 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0003430 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:14 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e119 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:53:14 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:53:14 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d8004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:15 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:53:15 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:53:15 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:53:15.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:53:15 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "24"}]': finished
Dec  1 04:53:15 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:53:15 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:53:15 np0005540826 ceph-mon[80026]: Reconfiguring osd.2 (unknown last config time)...
Dec  1 04:53:15 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "osd.2"}]: dispatch
Dec  1 04:53:15 np0005540826 ceph-mon[80026]: Reconfiguring daemon osd.2 on compute-2
Dec  1 04:53:15 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:53:15 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:53:15 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:53:15 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0f400a7e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:15 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:53:15 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 04:53:15 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:53:15.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 04:53:16 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "dashboard set-grafana-api-url", "value": "https://192.168.122.100:3000"}]: dispatch
Dec  1 04:53:16 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:53:16 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "25"}]: dispatch
Dec  1 04:53:16 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e120 e120: 3 total, 3 up, 3 in
Dec  1 04:53:16 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[85336]: 01/12/2025 09:53:16 : epoch 692d6526 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0cc003a20 fd 39 proxy ignored for local
Dec  1 04:53:16 np0005540826 kernel: ganesha.nfsd[88933]: segfault at 50 ip 00007fd1a860e32e sp 00007fd167ffe210 error 4 in libntirpc.so.5.8[7fd1a85f3000+2c000] likely on CPU 4 (core 0, socket 4)
Dec  1 04:53:16 np0005540826 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Dec  1 04:53:16 np0005540826 systemd[1]: Started Process Core Dump (PID 90232/UID 0).
Dec  1 04:53:17 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:53:17 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:53:17 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:53:17.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:53:17 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e121 e121: 3 total, 3 up, 3 in
Dec  1 04:53:17 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 121 pg[10.19( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=89/89 les/c/f=90/90/0 sis=121) [0] r=0 lpr=121 pi=[89,121)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:53:17 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "25"}]': finished
Dec  1 04:53:17 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "26"}]: dispatch
Dec  1 04:53:17 np0005540826 systemd-coredump[90233]: Process 85340 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 60:#012#0  0x00007fd1a860e32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Dec  1 04:53:17 np0005540826 systemd[1]: systemd-coredump@1-90232-0.service: Deactivated successfully.
Dec  1 04:53:17 np0005540826 systemd[1]: systemd-coredump@1-90232-0.service: Consumed 1.243s CPU time.
Dec  1 04:53:17 np0005540826 podman[90246]: 2025-12-01 09:53:17.869607713 +0000 UTC m=+0.040195881 container died 470db443dfbb44b498655286f13305e4bbfd9663e0774e5d00795d81ba1e8b0e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, ceph=True, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:53:17 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:53:17 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:53:17 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:53:17.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:53:17 np0005540826 systemd[1]: var-lib-containers-storage-overlay-db5e876a289a7c8f96978333827b4cee0e52c61705fee3fbba3b464c4e2cfdd5-merged.mount: Deactivated successfully.
Dec  1 04:53:17 np0005540826 podman[90246]: 2025-12-01 09:53:17.990856209 +0000 UTC m=+0.161444317 container remove 470db443dfbb44b498655286f13305e4bbfd9663e0774e5d00795d81ba1e8b0e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:53:17 np0005540826 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.0.0.compute-1.osfnzc.service: Main process exited, code=exited, status=139/n/a
Dec  1 04:53:18 np0005540826 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.0.0.compute-1.osfnzc.service: Failed with result 'exit-code'.
Dec  1 04:53:18 np0005540826 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.0.0.compute-1.osfnzc.service: Consumed 1.766s CPU time.
Dec  1 04:53:18 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "26"}]': finished
Dec  1 04:53:18 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e122 e122: 3 total, 3 up, 3 in
Dec  1 04:53:18 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 122 pg[10.19( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=89/89 les/c/f=90/90/0 sis=122) [0]/[1] r=-1 lpr=122 pi=[89,122)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:53:18 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 122 pg[10.19( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=89/89 les/c/f=90/90/0 sis=122) [0]/[1] r=-1 lpr=122 pi=[89,122)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec  1 04:53:19 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:53:19 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:53:19 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:53:19.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:53:19 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:53:19 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "27"}]: dispatch
Dec  1 04:53:19 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:53:19 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 04:53:19 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:53:19 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e123 e123: 3 total, 3 up, 3 in
Dec  1 04:53:19 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e123 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:53:19 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:53:19 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:53:19 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:53:19.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:53:20 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e124 e124: 3 total, 3 up, 3 in
Dec  1 04:53:20 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "27"}]': finished
Dec  1 04:53:20 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:53:20 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 04:53:20 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 124 pg[10.19( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=7 ec=61/50 lis/c=122/89 les/c/f=123/90/0 sis=124) [0] r=0 lpr=124 pi=[89,124)/1 luod=0'0 crt=56'1015 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:53:20 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 124 pg[10.19( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=7 ec=61/50 lis/c=122/89 les/c/f=123/90/0 sis=124) [0] r=0 lpr=124 pi=[89,124)/1 crt=56'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:53:21 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:53:21 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:53:21 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:53:21.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:53:21 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "28"}]: dispatch
Dec  1 04:53:21 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e125 e125: 3 total, 3 up, 3 in
Dec  1 04:53:21 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 125 pg[10.1b( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=95/95 les/c/f=96/96/0 sis=125) [0] r=0 lpr=125 pi=[95,125)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:53:21 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:53:21 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:53:21 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:53:21.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:53:21 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 125 pg[10.19( v 56'1015 (0'0,56'1015] local-lis/les=124/125 n=7 ec=61/50 lis/c=122/89 les/c/f=123/90/0 sis=124) [0] r=0 lpr=124 pi=[89,124)/1 crt=56'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:53:22 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e126 e126: 3 total, 3 up, 3 in
Dec  1 04:53:22 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis[85207]: [WARNING] 334/095322 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  1 04:53:22 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 126 pg[10.1b( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=95/95 les/c/f=96/96/0 sis=126) [0]/[1] r=-1 lpr=126 pi=[95,126)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:53:22 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 126 pg[10.1b( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=95/95 les/c/f=96/96/0 sis=126) [0]/[1] r=-1 lpr=126 pi=[95,126)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec  1 04:53:22 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "28"}]': finished
Dec  1 04:53:23 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:53:23 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:53:23 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:53:23.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:53:23 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:53:23 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:53:23 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:53:23.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:53:23 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e127 e127: 3 total, 3 up, 3 in
Dec  1 04:53:23 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "29"}]: dispatch
Dec  1 04:53:23 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "29"}]': finished
Dec  1 04:53:24 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e127 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:53:25 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e128 e128: 3 total, 3 up, 3 in
Dec  1 04:53:25 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 128 pg[10.1b( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=2 ec=61/50 lis/c=126/95 les/c/f=127/96/0 sis=128) [0] r=0 lpr=128 pi=[95,128)/1 luod=0'0 crt=56'1015 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:53:25 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 128 pg[10.1b( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=2 ec=61/50 lis/c=126/95 les/c/f=127/96/0 sis=128) [0] r=0 lpr=128 pi=[95,128)/1 crt=56'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:53:25 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:53:25 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:53:25 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:53:25.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:53:25 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:53:25 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 04:53:25 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:53:25.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 04:53:26 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "30"}]: dispatch
Dec  1 04:53:26 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:53:26 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:53:26 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e129 e129: 3 total, 3 up, 3 in
Dec  1 04:53:26 np0005540826 ceph-osd[77525]: osd.0 pg_epoch: 129 pg[10.1b( v 56'1015 (0'0,56'1015] local-lis/les=128/129 n=2 ec=61/50 lis/c=126/95 les/c/f=127/96/0 sis=128) [0] r=0 lpr=128 pi=[95,128)/1 crt=56'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:53:27 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:53:27 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 04:53:27 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:53:27.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 04:53:27 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "30"}]': finished
Dec  1 04:53:27 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:53:27 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:53:27 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:53:27.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:53:28 np0005540826 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.0.0.compute-1.osfnzc.service: Scheduled restart job, restart counter is at 2.
Dec  1 04:53:28 np0005540826 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.osfnzc for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 04:53:28 np0005540826 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.0.0.compute-1.osfnzc.service: Consumed 1.766s CPU time.
Dec  1 04:53:28 np0005540826 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.osfnzc for 365f19c2-81e5-5edd-b6b4-280555214d3a...
Dec  1 04:53:28 np0005540826 podman[90400]: 2025-12-01 09:53:28.609752875 +0000 UTC m=+0.044129480 container create fd0d7f9278875d4f993730918493cd7fa0a14e8c8961fe6a62e78c0bac5c3aa0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.build-date=20250325)
Dec  1 04:53:28 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/707dd3fa0ff785c74b1ccd7f85a182fa688f9846b64ad23c433dd0c12365a11c/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Dec  1 04:53:28 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/707dd3fa0ff785c74b1ccd7f85a182fa688f9846b64ad23c433dd0c12365a11c/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:53:28 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/707dd3fa0ff785c74b1ccd7f85a182fa688f9846b64ad23c433dd0c12365a11c/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 04:53:28 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/707dd3fa0ff785c74b1ccd7f85a182fa688f9846b64ad23c433dd0c12365a11c/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.osfnzc-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 04:53:28 np0005540826 podman[90400]: 2025-12-01 09:53:28.668402079 +0000 UTC m=+0.102778704 container init fd0d7f9278875d4f993730918493cd7fa0a14e8c8961fe6a62e78c0bac5c3aa0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default)
Dec  1 04:53:28 np0005540826 podman[90400]: 2025-12-01 09:53:28.673948738 +0000 UTC m=+0.108325343 container start fd0d7f9278875d4f993730918493cd7fa0a14e8c8961fe6a62e78c0bac5c3aa0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.license=GPLv2, ceph=True, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:53:28 np0005540826 bash[90400]: fd0d7f9278875d4f993730918493cd7fa0a14e8c8961fe6a62e78c0bac5c3aa0
Dec  1 04:53:28 np0005540826 podman[90400]: 2025-12-01 09:53:28.591282301 +0000 UTC m=+0.025658916 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 04:53:28 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:53:28 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Dec  1 04:53:28 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:53:28 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Dec  1 04:53:28 np0005540826 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.osfnzc for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 04:53:28 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:53:28 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Dec  1 04:53:28 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:53:28 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Dec  1 04:53:28 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:53:28 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Dec  1 04:53:28 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:53:28 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Dec  1 04:53:28 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:53:28 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Dec  1 04:53:28 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:53:28 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  1 04:53:29 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:53:29 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 04:53:29 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:53:29.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 04:53:29 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:53:29 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:53:29 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:53:29 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:53:29.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:53:31 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:53:31 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:53:31 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:53:31.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:53:31 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "31"}]: dispatch
Dec  1 04:53:31 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e130 e130: 3 total, 3 up, 3 in
Dec  1 04:53:31 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:53:31 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:53:31 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:53:31.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:53:32 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "31"}]': finished
Dec  1 04:53:32 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e131 e131: 3 total, 3 up, 3 in
Dec  1 04:53:33 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:53:33 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:53:33 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:53:33.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:53:33 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e132 e132: 3 total, 3 up, 3 in
Dec  1 04:53:33 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec  1 04:53:33 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:53:33 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:53:33 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:53:33.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:53:33 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e133 e133: 3 total, 3 up, 3 in
Dec  1 04:53:34 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "32"}]': finished
Dec  1 04:53:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:53:34 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  1 04:53:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:53:34 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 04:53:34 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e133 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:53:34 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e134 e134: 3 total, 3 up, 3 in
Dec  1 04:53:35 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:53:35 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:53:35 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:53:35.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:53:35 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:53:35 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 04:53:35 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:53:35.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 04:53:35 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e135 e135: 3 total, 3 up, 3 in
Dec  1 04:53:37 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 e136: 3 total, 3 up, 3 in
Dec  1 04:53:37 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:53:37 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:53:37 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:53:37.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:53:37 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:53:37 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:53:37 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:53:37.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:53:39 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:53:39 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 04:53:39 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:53:39.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 04:53:39 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:53:39 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:53:39 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 04:53:39 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:53:39.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 04:53:40 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:53:40 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  1 04:53:40 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:53:40 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Dec  1 04:53:40 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:53:40 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Dec  1 04:53:40 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:53:40 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Dec  1 04:53:40 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:53:40 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Dec  1 04:53:40 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:53:40 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Dec  1 04:53:40 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:53:40 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Dec  1 04:53:40 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:53:40 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 04:53:40 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:53:40 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 04:53:40 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:53:40 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 04:53:40 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:53:40 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Dec  1 04:53:40 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:53:40 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 04:53:40 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:53:40 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Dec  1 04:53:40 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:53:40 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Dec  1 04:53:40 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:53:40 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Dec  1 04:53:40 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:53:40 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Dec  1 04:53:40 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:53:40 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Dec  1 04:53:40 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:53:40 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Dec  1 04:53:40 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:53:40 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Dec  1 04:53:40 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:53:40 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Dec  1 04:53:40 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:53:40 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Dec  1 04:53:40 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:53:40 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Dec  1 04:53:40 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:53:40 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Dec  1 04:53:40 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:53:40 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Dec  1 04:53:40 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:53:40 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  1 04:53:40 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:53:40 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Dec  1 04:53:40 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:53:40 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  1 04:53:40 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:53:40 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3ca8000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:41 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:53:41 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 04:53:41 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:53:41.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 04:53:41 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:53:41 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c90000fb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:41 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:53:41 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:53:41 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:53:41.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:53:42 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:53:42 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c7c000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:42 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis[85207]: [WARNING] 334/095342 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  1 04:53:42 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:53:42 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c78000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:43 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:53:43 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 04:53:43 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:53:43.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 04:53:43 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:53:43 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c84000fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:43 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:53:43 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:53:43 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:53:43.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:53:44 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:53:44 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c90001cb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:44 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:53:44 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:53:44 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c7c0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:45 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:53:45 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:53:45 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:53:45.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:53:45 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:53:45 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c780016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:45 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:53:45 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 04:53:45 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:53:45.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 04:53:46 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:53:46 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c7c0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:46 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:53:46 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c90001cb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:47 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:53:47 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:53:47 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:53:47.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:53:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:53:47 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c84001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:47 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:53:47 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 04:53:47 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:53:47.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 04:53:48 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:53:48 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c780016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:48 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:53:48 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c7c0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:49 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:53:49 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 04:53:49 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:53:49.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 04:53:49 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:53:49 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c90001cb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:49 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:53:49 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:53:49 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 04:53:49 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:53:49.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 04:53:50 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:53:50 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c840023e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:50 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:53:50 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c780016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:51 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:53:51 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:53:51 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:53:51.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:53:51 np0005540826 python3.9[90659]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:53:51 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:53:51 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c7c0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:51 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:53:51 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 04:53:51 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:53:51.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 04:53:52 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:53:52 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c90001cb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:52 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:53:52 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c840023e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:53 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:53:53 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:53:53 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:53:53.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:53:53 np0005540826 python3.9[90947]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Dec  1 04:53:53 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:53:53 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c78002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:53 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:53:53 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:53:53 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:53:53.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:53:54 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:53:54 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c78002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:54 np0005540826 python3.9[91100]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Dec  1 04:53:54 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:53:54 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:53:54 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c90001cb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:55 np0005540826 python3.9[91252]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:53:55 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:53:55 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:53:55 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:53:55.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:53:55 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:53:55 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c840023e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:55 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:53:55 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:53:55 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:53:55.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:53:56 np0005540826 python3.9[91404]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Dec  1 04:53:56 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:53:56 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c78002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:56 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:53:56 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c78002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:57 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:53:57 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:53:57 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:53:57.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:53:57 np0005540826 python3.9[91582]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:53:57 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:53:57 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c90001cb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:57 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:53:57 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:53:57 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:53:57.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:53:58 np0005540826 python3.9[91734]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:53:58 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:53:58 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c84003730 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:58 np0005540826 python3.9[91813]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem _original_basename=tls-ca-bundle.pem recurse=False state=file path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:53:58 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:53:58 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c78002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:59 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:53:59 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:53:59 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:53:59.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:53:59 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:53:59 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c78002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:59 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:53:59 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:53:59 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:53:59 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:53:59.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:54:00 np0005540826 python3.9[91965]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:54:00 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:54:00 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c90001cb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:00 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:54:00 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c84003730 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:01 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:54:01 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:54:01 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:54:01.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:54:01 np0005540826 python3.9[92120]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Dec  1 04:54:01 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:54:01 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c78002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:01 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:54:01 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:54:01 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:54:01.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:54:02 np0005540826 python3.9[92273]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Dec  1 04:54:02 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:54:02 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c78002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:02 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:54:02 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c90001cb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:03 np0005540826 python3.9[92427]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec  1 04:54:03 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:54:03 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 04:54:03 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:54:03.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 04:54:03 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:54:03 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c84003730 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:03 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:54:03 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:54:03 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:54:03.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:54:04 np0005540826 python3.9[92579]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Dec  1 04:54:04 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:54:04 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c84003730 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:04 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:54:04 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:54:04 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c84003730 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:05 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:54:05 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:54:05 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:54:05.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:54:05 np0005540826 python3.9[92732]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  1 04:54:05 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:54:05 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c90001cb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:05 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:54:05 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:54:05 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:54:05.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:54:06 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:54:06 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c90001cb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:06 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:54:06 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3ca8000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:07 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:54:07 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 04:54:07 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:54:07.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 04:54:07 np0005540826 python3.9[92886]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:54:07 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:54:07 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c84003730 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:07 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:54:07 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:54:07 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:54:07.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:54:08 np0005540826 python3.9[93038]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:54:08 np0005540826 python3.9[93117]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/modules-load.d/99-edpm.conf _original_basename=edpm-modprobe.conf.j2 recurse=False state=file path=/etc/modules-load.d/99-edpm.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:54:08 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:54:08 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c84003730 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:08 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:54:08 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c90001cb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:09 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:54:09 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 04:54:09 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:54:09.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 04:54:09 np0005540826 python3.9[93269]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:54:09 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:54:09 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3ca80021f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:09 np0005540826 python3.9[93347]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/sysctl.d/99-edpm.conf _original_basename=edpm-sysctl.conf.j2 recurse=False state=file path=/etc/sysctl.d/99-edpm.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:54:09 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:54:09 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:54:09 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:54:09 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:54:09.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:54:10 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:54:10 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c84003730 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:10 np0005540826 python3.9[93500]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  1 04:54:10 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:54:10 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c84003730 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:11 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:54:11 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 04:54:11 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:54:11.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 04:54:11 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:54:11 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c90001cb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:12 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:54:12 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 04:54:12 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:54:12.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 04:54:12 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:54:12 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3ca80021f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:13 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:54:12 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c84003730 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:13 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:54:13 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 04:54:13 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:54:13.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 04:54:13 np0005540826 python3.9[93653]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:54:13 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:54:13 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3ca0001110 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:14 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:54:14 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:54:14 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:54:14.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:54:14 np0005540826 python3.9[93805]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Dec  1 04:54:14 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:54:14 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c90001cb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:14 np0005540826 python3.9[93956]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:54:14 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:54:15 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:54:15 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3ca80021f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:15 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:54:15 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 04:54:15 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:54:15.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 04:54:15 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:54:15 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c84003730 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:16 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:54:16 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:54:16 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:54:16.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:54:16 np0005540826 python3.9[94109]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 04:54:16 np0005540826 systemd[1]: Stopping Dynamic System Tuning Daemon...
Dec  1 04:54:16 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:54:16 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3ca0001110 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:16 np0005540826 systemd[1]: tuned.service: Deactivated successfully.
Dec  1 04:54:16 np0005540826 systemd[1]: Stopped Dynamic System Tuning Daemon.
Dec  1 04:54:16 np0005540826 systemd[1]: Starting Dynamic System Tuning Daemon...
Dec  1 04:54:16 np0005540826 systemd[1]: Started Dynamic System Tuning Daemon.
Dec  1 04:54:17 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:54:17 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c90001cb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:17 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:54:17 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:54:17 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:54:17.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:54:17 np0005540826 python3.9[94297]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Dec  1 04:54:17 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:54:17 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3ca80095a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:18 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:54:18 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:54:18 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:54:18.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:54:18 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:54:18 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c84003730 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:19 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:54:19 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3ca00022a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:19 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:54:19 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:54:19 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:54:19.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:54:19 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:54:19 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c90001cb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:19 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:54:20 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:54:20 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:54:20 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:54:20.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:54:20 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:54:20 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3ca80095a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:21 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:54:21 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c84003730 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:21 np0005540826 python3.9[94451]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 04:54:21 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:54:21 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:54:21 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:54:21.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:54:21 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:54:21 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3ca00022a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:21 np0005540826 python3.9[94605]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 04:54:22 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:54:22 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:54:22 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:54:22.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:54:22 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:54:22 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c90001cb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:22 np0005540826 systemd[1]: session-38.scope: Deactivated successfully.
Dec  1 04:54:22 np0005540826 systemd[1]: session-38.scope: Consumed 1min 4.953s CPU time.
Dec  1 04:54:22 np0005540826 systemd-logind[787]: Session 38 logged out. Waiting for processes to exit.
Dec  1 04:54:22 np0005540826 systemd-logind[787]: Removed session 38.
Dec  1 04:54:23 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:54:23 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3ca800a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:23 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:54:23 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:54:23 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:54:23.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:54:23 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:54:23 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c84003730 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:24 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:54:24 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:54:24 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:54:24.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:54:24 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:54:24 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3ca00022a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:24 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:54:25 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:54:25 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c90001cb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:25 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:54:25 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 04:54:25 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:54:25.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 04:54:25 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:54:25 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3ca800a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:26 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:54:26 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:54:26 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:54:26.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:54:26 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:54:26 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c84003730 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:27 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:54:27 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3ca0003730 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:27 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:54:27 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:54:27 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:54:27.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:54:27 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:54:27 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c90001cb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:28 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:54:28 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:54:28 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:54:28.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:54:28 np0005540826 systemd-logind[787]: New session 39 of user zuul.
Dec  1 04:54:28 np0005540826 systemd[1]: Started Session 39 of User zuul.
Dec  1 04:54:28 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:54:28 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3ca800a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:29 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:54:29 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c84003730 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:29 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:54:29 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 04:54:29 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:54:29.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 04:54:29 np0005540826 python3.9[94870]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:54:29 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:54:29 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3ca0003730 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:29 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:54:30 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:54:30 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:54:30 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:54:30.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:54:30 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:54:30 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:54:30 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 04:54:30 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:54:30 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:54:30 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 04:54:30 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:54:30 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c84003730 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:30 np0005540826 python3.9[95027]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Dec  1 04:54:31 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:54:31 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c90001cb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:31 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:54:31 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:54:31 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:54:31.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:54:31 np0005540826 python3.9[95180]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  1 04:54:31 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:54:31 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3ca800a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:32 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:54:32 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 04:54:32 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:54:32.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 04:54:32 np0005540826 python3.9[95265]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec  1 04:54:32 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:54:32 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3ca0003730 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:33 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:54:33 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c84003730 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:33 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:54:33 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 04:54:33 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:54:33.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 04:54:33 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:54:33 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c90001cb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:34 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:54:34 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:54:34 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:54:34.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:54:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:54:34 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c90001cb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:34 np0005540826 python3.9[95419]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  1 04:54:34 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:54:35 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:54:35 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3ca0004440 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:35 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:54:35 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:54:35 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:54:35.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:54:35 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:54:35 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c84004440 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:35 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:54:35 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:54:36 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:54:36 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:54:36 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:54:36.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:54:36 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:54:36 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c84004440 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:37 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:54:37 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3ca800a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:37 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:54:37 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:54:37 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:54:37.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:54:37 np0005540826 python3.9[95625]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec  1 04:54:37 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:54:37 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3ca800a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:38 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:54:38 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:54:38 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:54:38.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:54:38 np0005540826 python3.9[95779]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:54:38 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:54:38 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c7c001040 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:39 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:54:39 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c84004440 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:39 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:54:39 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:54:39 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:54:39.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:54:39 np0005540826 ceph-mon[80026]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #19. Immutable memtables: 0.
Dec  1 04:54:39 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-09:54:39.355561) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  1 04:54:39 np0005540826 ceph-mon[80026]: rocksdb: [db/flush_job.cc:856] [default] [JOB 7] Flushing memtable with next log file: 19
Dec  1 04:54:39 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582879355643, "job": 7, "event": "flush_started", "num_memtables": 1, "num_entries": 2555, "num_deletes": 252, "total_data_size": 9923008, "memory_usage": 10223424, "flush_reason": "Manual Compaction"}
Dec  1 04:54:39 np0005540826 ceph-mon[80026]: rocksdb: [db/flush_job.cc:885] [default] [JOB 7] Level-0 flush table #20: started
Dec  1 04:54:39 np0005540826 python3.9[95931]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Dec  1 04:54:39 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582879383128, "cf_name": "default", "job": 7, "event": "table_file_creation", "file_number": 20, "file_size": 6164538, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 8108, "largest_seqno": 10658, "table_properties": {"data_size": 6153332, "index_size": 7252, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2885, "raw_key_size": 24268, "raw_average_key_size": 21, "raw_value_size": 6130188, "raw_average_value_size": 5330, "num_data_blocks": 321, "num_entries": 1150, "num_filter_entries": 1150, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582754, "oldest_key_time": 1764582754, "file_creation_time": 1764582879, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d6e1f97e-eb58-41c1-b758-cb672eabd75e", "db_session_id": "KSHNHU57VR1LHZ6EZUM4", "orig_file_number": 20, "seqno_to_time_mapping": "N/A"}}
Dec  1 04:54:39 np0005540826 ceph-mon[80026]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 7] Flush lasted 27637 microseconds, and 11955 cpu microseconds.
Dec  1 04:54:39 np0005540826 ceph-mon[80026]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 04:54:39 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-09:54:39.383203) [db/flush_job.cc:967] [default] [JOB 7] Level-0 flush table #20: 6164538 bytes OK
Dec  1 04:54:39 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-09:54:39.383228) [db/memtable_list.cc:519] [default] Level-0 commit table #20 started
Dec  1 04:54:39 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-09:54:39.384742) [db/memtable_list.cc:722] [default] Level-0 commit table #20: memtable #1 done
Dec  1 04:54:39 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-09:54:39.384760) EVENT_LOG_v1 {"time_micros": 1764582879384755, "job": 7, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  1 04:54:39 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-09:54:39.384778) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  1 04:54:39 np0005540826 ceph-mon[80026]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 7] Try to delete WAL files size 9910955, prev total WAL file size 9910955, number of live WAL files 2.
Dec  1 04:54:39 np0005540826 ceph-mon[80026]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000016.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 04:54:39 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-09:54:39.387301) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F7300323531' seq:72057594037927935, type:22 .. '7061786F7300353033' seq:0, type:0; will stop at (end)
Dec  1 04:54:39 np0005540826 ceph-mon[80026]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 8] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  1 04:54:39 np0005540826 ceph-mon[80026]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 7 Base level 0, inputs: [20(6020KB)], [18(12MB)]
Dec  1 04:54:39 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582879387352, "job": 8, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [20], "files_L6": [18], "score": -1, "input_data_size": 19199989, "oldest_snapshot_seqno": -1}
Dec  1 04:54:39 np0005540826 ceph-mon[80026]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 8] Generated table #21: 4131 keys, 14810333 bytes, temperature: kUnknown
Dec  1 04:54:39 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582879465495, "cf_name": "default", "job": 8, "event": "table_file_creation", "file_number": 21, "file_size": 14810333, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14776837, "index_size": 22058, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10373, "raw_key_size": 105306, "raw_average_key_size": 25, "raw_value_size": 14695317, "raw_average_value_size": 3557, "num_data_blocks": 945, "num_entries": 4131, "num_filter_entries": 4131, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582559, "oldest_key_time": 0, "file_creation_time": 1764582879, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d6e1f97e-eb58-41c1-b758-cb672eabd75e", "db_session_id": "KSHNHU57VR1LHZ6EZUM4", "orig_file_number": 21, "seqno_to_time_mapping": "N/A"}}
Dec  1 04:54:39 np0005540826 ceph-mon[80026]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 04:54:39 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-09:54:39.465765) [db/compaction/compaction_job.cc:1663] [default] [JOB 8] Compacted 1@0 + 1@6 files to L6 => 14810333 bytes
Dec  1 04:54:39 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-09:54:39.466777) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 245.3 rd, 189.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(5.9, 12.4 +0.0 blob) out(14.1 +0.0 blob), read-write-amplify(5.5) write-amplify(2.4) OK, records in: 4667, records dropped: 536 output_compression: NoCompression
Dec  1 04:54:39 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-09:54:39.466794) EVENT_LOG_v1 {"time_micros": 1764582879466785, "job": 8, "event": "compaction_finished", "compaction_time_micros": 78261, "compaction_time_cpu_micros": 32808, "output_level": 6, "num_output_files": 1, "total_output_size": 14810333, "num_input_records": 4667, "num_output_records": 4131, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  1 04:54:39 np0005540826 ceph-mon[80026]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000020.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 04:54:39 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582879467883, "job": 8, "event": "table_file_deletion", "file_number": 20}
Dec  1 04:54:39 np0005540826 ceph-mon[80026]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000018.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 04:54:39 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582879469800, "job": 8, "event": "table_file_deletion", "file_number": 18}
Dec  1 04:54:39 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-09:54:39.387235) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:54:39 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-09:54:39.469889) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:54:39 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-09:54:39.469894) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:54:39 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-09:54:39.469895) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:54:39 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-09:54:39.469898) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:54:39 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-09:54:39.469899) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:54:39 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:54:39 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3ca800a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:39 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:54:40 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:54:40 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 04:54:40 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:54:40.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 04:54:40 np0005540826 python3.9[96082]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:54:40 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:54:40 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3ca800a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:41 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:54:41 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3ca800a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:41 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:54:41 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 04:54:41 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:54:41.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 04:54:41 np0005540826 python3.9[96240]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  1 04:54:41 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:54:41 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c7c001040 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:42 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:54:42 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:54:42 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:54:42.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:54:42 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:54:42 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c90001cb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:43 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:54:43 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c90001cb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:43 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:54:43 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 04:54:43 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:54:43.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 04:54:43 np0005540826 systemd[1]: session-19.scope: Deactivated successfully.
Dec  1 04:54:43 np0005540826 systemd[1]: session-19.scope: Consumed 8.839s CPU time.
Dec  1 04:54:43 np0005540826 systemd-logind[787]: Session 19 logged out. Waiting for processes to exit.
Dec  1 04:54:43 np0005540826 systemd-logind[787]: Removed session 19.
Dec  1 04:54:43 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:54:43 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3ca800a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:43 np0005540826 python3.9[96394]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:54:44 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:54:44 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 04:54:44 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:54:44.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 04:54:44 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:54:44 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c7c0011e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:44 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:54:45 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:54:45 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c7c0011e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:45 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:54:45 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:54:45 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:54:45.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:54:45 np0005540826 python3.9[96682]: ansible-ansible.builtin.file Invoked with mode=0750 path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec  1 04:54:45 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:54:45 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c84004440 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:46 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:54:46 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:54:46 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:54:46.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:54:46 np0005540826 python3.9[96833]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:54:46 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:54:46 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3ca800a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:54:47 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c90001cb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:47 np0005540826 python3.9[96987]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  1 04:54:47 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:54:47 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 04:54:47 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:54:47.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 04:54:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:54:47 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c7c0011e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:48 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:54:48 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 04:54:48 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:54:48.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 04:54:48 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:54:48 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c84004440 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:49 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:54:49 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3ca800a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:49 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:54:49 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:54:49 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:54:49.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:54:49 np0005540826 python3.9[97141]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  1 04:54:49 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:54:49 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c90001cb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:49 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:54:50 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:54:50 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:54:50 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:54:50.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:54:50 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:54:50 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c7c003770 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:51 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:54:51 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c84004440 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:51 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:54:51 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:54:51 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:54:51.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:54:51 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:54:51 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3ca800a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:51 np0005540826 python3.9[97295]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:54:52 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:54:52 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:54:52 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:54:52.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:54:52 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:54:52 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c90001cb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:52 np0005540826 python3.9[97450]: ansible-ansible.builtin.slurp Invoked with path=/var/lib/edpm-config/os-net-config.returncode src=/var/lib/edpm-config/os-net-config.returncode
Dec  1 04:54:53 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:54:53 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c7c003770 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:53 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:54:53 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:54:53 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:54:53.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:54:53 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis[85207]: [WARNING] 334/095453 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 1ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  1 04:54:53 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:54:53 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c84004440 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:54 np0005540826 systemd[1]: session-39.scope: Deactivated successfully.
Dec  1 04:54:54 np0005540826 systemd[1]: session-39.scope: Consumed 18.488s CPU time.
Dec  1 04:54:54 np0005540826 systemd-logind[787]: Session 39 logged out. Waiting for processes to exit.
Dec  1 04:54:54 np0005540826 systemd-logind[787]: Removed session 39.
Dec  1 04:54:54 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:54:54 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 04:54:54 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:54:54.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 04:54:54 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:54:54 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3ca800a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:54 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:54:55 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:54:55 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c90001cb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:55 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:54:55 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:54:55 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:54:55.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:54:55 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:54:55 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c7c004480 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:56 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:54:56 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:54:56 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:54:56.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:54:56 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:54:56 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c84004440 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:57 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:54:57 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3ca800a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:57 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:54:57 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 04:54:57 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:54:57.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 04:54:57 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:54:57 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c90001cb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:58 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:54:58 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:54:58 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:54:58.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:54:58 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:54:58 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c7c004480 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:59 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:54:59 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c84004440 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:59 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:54:59 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:54:59 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:54:59.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:54:59 np0005540826 systemd-logind[787]: New session 40 of user zuul.
Dec  1 04:54:59 np0005540826 systemd[1]: Started Session 40 of User zuul.
Dec  1 04:54:59 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:54:59 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3ca800a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:59 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:55:00 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:55:00 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:55:00 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:55:00.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:55:00 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:00 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c90001cb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:00 np0005540826 python3.9[97658]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:55:01 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:01 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c7c004480 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:01 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:55:01 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 04:55:01 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:55:01.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 04:55:01 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:01 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  1 04:55:01 np0005540826 python3.9[97812]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  1 04:55:01 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:01 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c84004440 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:02 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:55:02 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:55:02 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:55:02.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:55:02 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:02 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3ca800a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:03 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:03 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c90001cb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:03 np0005540826 python3.9[98006]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:55:03 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:55:03 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:55:03 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:55:03.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:55:03 np0005540826 systemd[1]: session-40.scope: Deactivated successfully.
Dec  1 04:55:03 np0005540826 systemd[1]: session-40.scope: Consumed 2.433s CPU time.
Dec  1 04:55:03 np0005540826 systemd-logind[787]: Session 40 logged out. Waiting for processes to exit.
Dec  1 04:55:03 np0005540826 systemd-logind[787]: Removed session 40.
Dec  1 04:55:03 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:03 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c90001cb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:04 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:55:04 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:55:04 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:55:04.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:55:04 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:04 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c84004440 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:04 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:04 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  1 04:55:04 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:04 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 04:55:04 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:55:05 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:05 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3ca800a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:05 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:55:05 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 04:55:05 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:55:05.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 04:55:05 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:05 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c90001cb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:06 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:55:06 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:55:06 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:55:06.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:55:06 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:06 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c90001cb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:07 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:07 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c84004440 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:07 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:55:07 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:55:07 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:55:07.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:55:07 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:07 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  1 04:55:07 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:07 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3ca800a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:08 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:55:08 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:55:08 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:55:08.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:55:08 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:08 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3ca800a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:09 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:09 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3ca800a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:09 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:55:09 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:55:09 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:55:09.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:55:09 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:09 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c78001090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:09 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:55:10 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:55:10 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:55:10 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:55:10.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:55:10 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:10 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c70000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:11 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:11 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3ca0003150 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:11 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:55:11 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:55:11 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:55:11.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:55:11 np0005540826 systemd-logind[787]: New session 41 of user zuul.
Dec  1 04:55:11 np0005540826 systemd[1]: Started Session 41 of User zuul.
Dec  1 04:55:11 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:11 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3ca800a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:12 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:55:12 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:55:12 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:55:12.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:55:12 np0005540826 python3.9[98193]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:55:12 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:12 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c78001f40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:13 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:13 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c700016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:13 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:55:13 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:55:13 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:55:13.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:55:13 np0005540826 python3.9[98347]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:55:13 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis[85207]: [WARNING] 334/095513 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  1 04:55:13 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:13 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3ca0001eb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:14 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:55:14 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.002000051s ======
Dec  1 04:55:14 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:55:14.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000051s
Dec  1 04:55:14 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:14 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3ca800a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:14 np0005540826 python3.9[98504]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  1 04:55:14 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:55:15 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:15 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c78001f40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:15 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:55:15 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:55:15 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:55:15.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:55:15 np0005540826 python3.9[98588]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  1 04:55:15 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:15 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c700016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:16 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:55:16 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:55:16 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:55:16.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:55:16 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:16 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3ca0001eb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:17 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:17 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3ca800a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:17 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:55:17 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:55:17 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:55:17.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:55:17 np0005540826 python3.9[98767]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  1 04:55:17 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:17 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c78001f40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:18 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:55:18 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:55:18 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:55:18.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:55:18 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:18 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c700016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:18 np0005540826 python3.9[98963]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:55:19 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:19 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3ca0001eb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:19 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:55:19 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 04:55:19 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:55:19.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 04:55:19 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:19 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3ca800a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:19 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:55:19 np0005540826 python3.9[99115]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:55:20 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:55:20 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:55:20 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:55:20.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:55:20 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:20 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3ca800a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:20 np0005540826 python3.9[99281]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:55:21 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:21 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c70002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:21 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:55:21 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:55:21 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:55:21.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:55:21 np0005540826 python3.9[99359]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:55:21 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:21 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3ca0001eb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:22 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:55:22 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:55:22 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:55:22.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:55:22 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:22 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3ca800a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:22 np0005540826 python3.9[99512]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:55:22 np0005540826 python3.9[99590]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:55:23 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:23 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3ca800a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:23 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:55:23 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:55:23 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:55:23.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:55:23 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:23 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c70002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:24 np0005540826 python3.9[99742]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:55:24 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:55:24 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 04:55:24 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:55:24.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 04:55:24 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:24 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3ca0001eb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:24 np0005540826 python3.9[99895]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:55:24 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:55:25 np0005540826 irqbalance[785]: Cannot change IRQ 26 affinity: Operation not permitted
Dec  1 04:55:25 np0005540826 irqbalance[785]: IRQ 26 affinity is now unmanaged
Dec  1 04:55:25 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:25 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3ca800a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:25 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:55:25 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:55:25 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:55:25.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:55:25 np0005540826 python3.9[100047]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:55:25 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:25 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c78002d30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:25 np0005540826 python3.9[100199]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:55:26 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:55:26 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:55:26 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:55:26.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:55:26 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:26 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c70002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:26 np0005540826 python3.9[100352]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  1 04:55:27 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:27 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3ca00040b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:27 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:55:27 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:55:27 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:55:27.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:55:27 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:27 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3ca800a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:28 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:55:28 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 04:55:28 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:55:28.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 04:55:28 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:28 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3ca800a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:29 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:29 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c70003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:29 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:55:29 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:55:29 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:55:29.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:55:29 np0005540826 python3.9[100506]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:55:29 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:29 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3ca00040b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:29 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:55:30 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:55:30 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:55:30 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:55:30.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:55:30 np0005540826 python3.9[100661]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:55:30 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:30 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3ca800a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:31 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:31 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c78003650 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:31 np0005540826 python3.9[100813]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:55:31 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:55:31 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:55:31 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:55:31.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:55:31 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:31 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c70003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:32 np0005540826 python3.9[100965]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:55:32 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:55:32 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:55:32 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:55:32.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:55:32 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:32 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3ca00040b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:33 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:33 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3ca800a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:33 np0005540826 python3.9[101119]: ansible-service_facts Invoked
Dec  1 04:55:33 np0005540826 network[101136]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec  1 04:55:33 np0005540826 network[101137]: 'network-scripts' will be removed from distribution in near future.
Dec  1 04:55:33 np0005540826 network[101138]: It is advised to switch to 'NetworkManager' instead for network management.
Dec  1 04:55:33 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:55:33 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:55:33 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:55:33.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:55:33 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:33 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c78003650 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:34 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:55:34 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:55:34 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:55:34.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:55:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:34 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c70003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:34 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:55:35 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:35 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3ca00040b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:35 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:55:35 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 04:55:35 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:55:35.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 04:55:35 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:35 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3ca800a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:36 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:55:36 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:55:36 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:55:36.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:55:36 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis[85207]: [WARNING] 334/095536 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 1ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  1 04:55:36 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:36 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c78003650 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:37 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:37 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c70003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:37 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:55:37 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:55:37 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:55:37.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:55:37 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:37 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3ca00040b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:38 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:55:38 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:55:38 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:55:38.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:55:38 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:38 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3ca800a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:39 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:39 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c78003650 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:39 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:55:39 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 04:55:39 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:55:39.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 04:55:39 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:55:39 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:55:39 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 04:55:39 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:55:39 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:55:39 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 04:55:39 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:39 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c70003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:39 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:55:40 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:55:40 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:55:40 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:55:40.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:55:40 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:40 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3ca00040b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:40 np0005540826 python3.9[101701]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  1 04:55:40 np0005540826 ceph-mon[80026]: Health check failed: 1 failed cephadm daemon(s) (CEPHADM_FAILED_DAEMON)
Dec  1 04:55:41 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:41 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c90000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:41 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:55:41 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:55:41 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:55:41.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:55:41 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:41 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c78003650 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:42 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:55:42 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:55:42 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:55:42.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:55:42 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:42 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c84001fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:43 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:43 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3ca00040b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:43 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:55:43 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:55:43 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:55:43.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:55:43 np0005540826 python3.9[101855]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Dec  1 04:55:43 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis[85207]: [WARNING] 334/095543 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  1 04:55:43 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:43 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c84001fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:44 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:55:44 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:55:44 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:55:44.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:55:44 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:44 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c78003650 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:44 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:55:44 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:55:44 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:55:44 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:44 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  1 04:55:44 np0005540826 python3.9[102033]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:55:45 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:45 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c90001c40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:45 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:55:45 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:55:45 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:55:45.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:55:45 np0005540826 python3.9[102111]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/chrony.conf _original_basename=chrony.conf.j2 recurse=False state=file path=/etc/chrony.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:55:45 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:45 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3ca00040b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:46 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:55:46 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 04:55:46 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:55:46.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 04:55:46 np0005540826 python3.9[102264]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:55:46 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:46 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c84001fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:46 np0005540826 python3.9[102342]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/sysconfig/chronyd _original_basename=chronyd.sysconfig.j2 recurse=False state=file path=/etc/sysconfig/chronyd force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:55:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:47 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c84001fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:47 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:55:47 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:55:47 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:55:47.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:55:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:47 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c90002560 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:47 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  1 04:55:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:47 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 04:55:48 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:55:48 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:55:48 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:55:48.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:55:48 np0005540826 python3.9[102495]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:55:48 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:48 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c84001fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:49 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:49 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 04:55:49 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:49 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 04:55:49 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:49 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c84001fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:49 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:55:49 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 04:55:49 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:55:49.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 04:55:49 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:49 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c78003650 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:50 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:55:50 np0005540826 python3.9[102647]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  1 04:55:50 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:55:50 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:55:50 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:55:50.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:55:50 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:50 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c90002560 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:51 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:51 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c84001fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:51 np0005540826 python3.9[102732]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 04:55:51 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:55:51 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 04:55:51 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:55:51.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 04:55:51 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:51 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c84001fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:52 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:52 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  1 04:55:52 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:55:52 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:55:52 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:55:52.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:55:52 np0005540826 systemd[1]: session-41.scope: Deactivated successfully.
Dec  1 04:55:52 np0005540826 systemd[1]: session-41.scope: Consumed 24.430s CPU time.
Dec  1 04:55:52 np0005540826 systemd-logind[787]: Session 41 logged out. Waiting for processes to exit.
Dec  1 04:55:52 np0005540826 systemd-logind[787]: Removed session 41.
Dec  1 04:55:52 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:52 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c78003650 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:53 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:53 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c90002560 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:53 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:55:53 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:55:53 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:55:53.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:55:53 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:53 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c90002560 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:54 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:55:54 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:55:54 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:55:54.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:55:54 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:54 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3ca00040b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:55 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:55 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c78003650 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:55 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:55:55 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:55:55 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 04:55:55 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:55:55.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 04:55:55 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:55:55 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:55 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  1 04:55:55 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:55 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c90002560 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:56 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:55:56 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:55:56 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:55:56.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:55:56 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:56 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c84004500 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:57 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:57 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c84004500 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:57 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:55:57 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:55:57 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:55:57.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:55:57 np0005540826 systemd-logind[787]: New session 42 of user zuul.
Dec  1 04:55:57 np0005540826 systemd[1]: Started Session 42 of User zuul.
Dec  1 04:55:57 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:57 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c78003650 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:58 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:55:58 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:55:58 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:55:58.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:55:58 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis[85207]: [WARNING] 334/095558 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  1 04:55:58 np0005540826 python3.9[102943]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:55:58 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:58 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  1 04:55:58 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:58 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 04:55:58 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:58 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c90003f60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:59 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:59 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c90003f60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:59 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:55:59 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:55:59 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:55:59.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:55:59 np0005540826 python3.9[103095]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/ceph-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:55:59 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:55:59 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c90003f60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:59 np0005540826 python3.9[103173]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/ceph-networks.yaml _original_basename=firewall.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/ceph-networks.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:56:00 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:56:00 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:56:00 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:56:00 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:56:00.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:56:00 np0005540826 systemd[1]: session-42.scope: Deactivated successfully.
Dec  1 04:56:00 np0005540826 systemd[1]: session-42.scope: Consumed 1.603s CPU time.
Dec  1 04:56:00 np0005540826 systemd-logind[787]: Session 42 logged out. Waiting for processes to exit.
Dec  1 04:56:00 np0005540826 systemd-logind[787]: Removed session 42.
Dec  1 04:56:00 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:56:00 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3ca00040b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:01 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:56:01 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c78004360 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:01 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:56:01 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:56:01 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:56:01.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:56:01 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:56:01 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  1 04:56:01 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:56:01 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c90003f60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:02 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:56:02 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:56:02 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:56:02.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:56:02 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:56:02 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c84004500 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:03 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:56:03 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c84004500 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:03 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:56:03 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:56:03 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:56:03.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:56:03 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis[85207]: [WARNING] 334/095603 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 1ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  1 04:56:03 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:56:03 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c78004360 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:04 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:56:04 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 04:56:04 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:56:04.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 04:56:04 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:56:04 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c90003f60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:05 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:56:05 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c84004500 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:05 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:56:05 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:56:05 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:56:05 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:56:05.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:56:05 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:56:05 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3ca00040b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:06 np0005540826 systemd-logind[787]: New session 43 of user zuul.
Dec  1 04:56:06 np0005540826 systemd[1]: Started Session 43 of User zuul.
Dec  1 04:56:06 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:56:06 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:56:06 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:56:06.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:56:06 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:56:06 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c78004360 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:07 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:56:07 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c90003f60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:07 np0005540826 python3.9[103355]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:56:07 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:56:07 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 04:56:07 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:56:07.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 04:56:07 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:56:07 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c84004500 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:08 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:56:08 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:56:08 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:56:08.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:56:08 np0005540826 python3.9[103512]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:56:08 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:56:08 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3ca00040b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:09 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:56:09 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c78004360 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:09 np0005540826 python3.9[103687]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:56:09 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:56:09 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 04:56:09 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:56:09.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 04:56:09 np0005540826 python3.9[103765]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.dx5r28_u recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:56:09 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:56:09 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c90003f60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:10 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:56:10 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:56:10 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 04:56:10 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:56:10.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 04:56:10 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:56:10 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c84004500 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:10 np0005540826 python3.9[103919]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:56:11 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:56:11 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3ca00040b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:11 np0005540826 python3.9[103998]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/sysconfig/podman_drop_in _original_basename=.zjsqubcb recurse=False state=file path=/etc/sysconfig/podman_drop_in force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:56:11 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:56:11 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 04:56:11 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:56:11.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 04:56:11 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:56:11 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c70001090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:12 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:56:12 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 04:56:12 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:56:12.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 04:56:12 np0005540826 python3.9[104151]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:56:12 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:56:12 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3ca8001d70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:13 np0005540826 python3.9[104303]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:56:13 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[90415]: 01/12/2025 09:56:13 : epoch 692d6598 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3c84004500 fd 39 proxy ignored for local
Dec  1 04:56:13 np0005540826 kernel: ganesha.nfsd[101623]: segfault at 50 ip 00007f3d5437932e sp 00007f3d14ff8210 error 4 in libntirpc.so.5.8[7f3d5435e000+2c000] likely on CPU 4 (core 0, socket 4)
Dec  1 04:56:13 np0005540826 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Dec  1 04:56:13 np0005540826 systemd[1]: Started Process Core Dump (PID 104306/UID 0).
Dec  1 04:56:13 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:56:13 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:56:13 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:56:13.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:56:13 np0005540826 python3.9[104383]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:56:14 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:56:14 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 04:56:14 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:56:14.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 04:56:14 np0005540826 python3.9[104536]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:56:14 np0005540826 systemd-coredump[104307]: Process 90419 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 61:#012#0  0x00007f3d5437932e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Dec  1 04:56:14 np0005540826 systemd[1]: systemd-coredump@2-104306-0.service: Deactivated successfully.
Dec  1 04:56:14 np0005540826 systemd[1]: systemd-coredump@2-104306-0.service: Consumed 1.338s CPU time.
Dec  1 04:56:14 np0005540826 podman[104619]: 2025-12-01 09:56:14.607061822 +0000 UTC m=+0.038088260 container died fd0d7f9278875d4f993730918493cd7fa0a14e8c8961fe6a62e78c0bac5c3aa0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.vendor=CentOS, ceph=True, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:56:14 np0005540826 systemd[1]: var-lib-containers-storage-overlay-707dd3fa0ff785c74b1ccd7f85a182fa688f9846b64ad23c433dd0c12365a11c-merged.mount: Deactivated successfully.
Dec  1 04:56:14 np0005540826 systemd[81783]: Created slice User Background Tasks Slice.
Dec  1 04:56:14 np0005540826 systemd[81783]: Starting Cleanup of User's Temporary Files and Directories...
Dec  1 04:56:14 np0005540826 systemd[81783]: Finished Cleanup of User's Temporary Files and Directories.
Dec  1 04:56:14 np0005540826 podman[104619]: 2025-12-01 09:56:14.670687364 +0000 UTC m=+0.101713772 container remove fd0d7f9278875d4f993730918493cd7fa0a14e8c8961fe6a62e78c0bac5c3aa0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:56:14 np0005540826 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.0.0.compute-1.osfnzc.service: Main process exited, code=exited, status=139/n/a
Dec  1 04:56:14 np0005540826 python3.9[104615]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:56:14 np0005540826 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.0.0.compute-1.osfnzc.service: Failed with result 'exit-code'.
Dec  1 04:56:14 np0005540826 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.0.0.compute-1.osfnzc.service: Consumed 1.900s CPU time.
Dec  1 04:56:15 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:56:15 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:56:15 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:56:15 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:56:15.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:56:15 np0005540826 python3.9[104810]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:56:16 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:56:16 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:56:16 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:56:16.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:56:16 np0005540826 python3.9[104963]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:56:16 np0005540826 python3.9[105041]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:56:17 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:56:17 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 04:56:17 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:56:17.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 04:56:17 np0005540826 python3.9[105218]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:56:18 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:56:18 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:56:18 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:56:18.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:56:18 np0005540826 python3.9[105297]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:56:19 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis[85207]: [WARNING] 334/095619 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  1 04:56:19 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:56:19 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:56:19 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:56:19.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:56:19 np0005540826 python3.9[105449]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 04:56:19 np0005540826 systemd[1]: Reloading.
Dec  1 04:56:19 np0005540826 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:56:19 np0005540826 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:56:20 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:56:20 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:56:20 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:56:20 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:56:20.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:56:20 np0005540826 python3.9[105640]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:56:21 np0005540826 python3.9[105718]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:56:21 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:56:21 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 04:56:21 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:56:21.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 04:56:22 np0005540826 python3.9[105870]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:56:22 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:56:22 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 04:56:22 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:56:22.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 04:56:22 np0005540826 python3.9[105949]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:56:23 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:56:23 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:56:23 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:56:23.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:56:23 np0005540826 python3.9[106101]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 04:56:23 np0005540826 systemd[1]: Reloading.
Dec  1 04:56:23 np0005540826 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:56:23 np0005540826 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:56:23 np0005540826 systemd[1]: Starting Create netns directory...
Dec  1 04:56:23 np0005540826 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec  1 04:56:23 np0005540826 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec  1 04:56:23 np0005540826 systemd[1]: Finished Create netns directory.
Dec  1 04:56:24 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:56:24 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:56:24 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:56:24.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:56:24 np0005540826 python3.9[106293]: ansible-ansible.builtin.service_facts Invoked
Dec  1 04:56:24 np0005540826 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.0.0.compute-1.osfnzc.service: Scheduled restart job, restart counter is at 3.
Dec  1 04:56:24 np0005540826 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.osfnzc for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 04:56:24 np0005540826 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.0.0.compute-1.osfnzc.service: Consumed 1.900s CPU time.
Dec  1 04:56:24 np0005540826 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.osfnzc for 365f19c2-81e5-5edd-b6b4-280555214d3a...
Dec  1 04:56:24 np0005540826 network[106312]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec  1 04:56:24 np0005540826 network[106314]: 'network-scripts' will be removed from distribution in near future.
Dec  1 04:56:24 np0005540826 network[106315]: It is advised to switch to 'NetworkManager' instead for network management.
Dec  1 04:56:25 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:56:25 np0005540826 podman[106362]: 2025-12-01 09:56:25.147520472 +0000 UTC m=+0.042641635 container create 30edbe2666d8c7b68ba15d8d5234b2457f8262c9136500c9912331454c46b4d5 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1)
Dec  1 04:56:25 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ffa527e6adc1af04ac0a0706dd9f2618ef1e4a2193605a121df13301dc85fe8d/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Dec  1 04:56:25 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ffa527e6adc1af04ac0a0706dd9f2618ef1e4a2193605a121df13301dc85fe8d/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:56:25 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ffa527e6adc1af04ac0a0706dd9f2618ef1e4a2193605a121df13301dc85fe8d/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 04:56:25 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ffa527e6adc1af04ac0a0706dd9f2618ef1e4a2193605a121df13301dc85fe8d/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.osfnzc-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 04:56:25 np0005540826 podman[106362]: 2025-12-01 09:56:25.218509289 +0000 UTC m=+0.113630472 container init 30edbe2666d8c7b68ba15d8d5234b2457f8262c9136500c9912331454c46b4d5 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:56:25 np0005540826 podman[106362]: 2025-12-01 09:56:25.224018788 +0000 UTC m=+0.119139951 container start 30edbe2666d8c7b68ba15d8d5234b2457f8262c9136500c9912331454c46b4d5 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:56:25 np0005540826 podman[106362]: 2025-12-01 09:56:25.128349469 +0000 UTC m=+0.023470652 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 04:56:25 np0005540826 bash[106362]: 30edbe2666d8c7b68ba15d8d5234b2457f8262c9136500c9912331454c46b4d5
Dec  1 04:56:25 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:56:25 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Dec  1 04:56:25 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:56:25 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Dec  1 04:56:25 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:56:25 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Dec  1 04:56:25 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:56:25 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Dec  1 04:56:25 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:56:25 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Dec  1 04:56:25 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:56:25 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Dec  1 04:56:25 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:56:25 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Dec  1 04:56:25 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:56:25 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  1 04:56:25 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:56:25 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:56:25 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:56:25.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:56:25 np0005540826 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.osfnzc for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 04:56:26 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:56:26 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:56:26 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:56:26.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:56:27 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:56:27 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:56:27 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:56:27.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:56:28 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:56:28 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:56:28 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:56:28.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:56:29 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:56:29 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:56:29 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:56:29.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:56:30 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:56:30 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:56:30 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 04:56:30 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:56:30.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 04:56:31 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:56:31 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  1 04:56:31 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:56:31 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 04:56:31 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:56:31 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:56:31 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:56:31.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:56:32 np0005540826 python3.9[106678]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:56:32 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:56:32 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 04:56:32 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:56:32.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 04:56:32 np0005540826 python3.9[106757]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/etc/ssh/sshd_config _original_basename=sshd_config_block.j2 recurse=False state=file path=/etc/ssh/sshd_config force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:56:33 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:56:33 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:56:33 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:56:33.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:56:33 np0005540826 python3.9[106909]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:56:34 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:56:34 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:56:34 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:56:34.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:56:34 np0005540826 python3.9[107062]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:56:34 np0005540826 python3.9[107140]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/var/lib/edpm-config/firewall/sshd-networks.yaml _original_basename=firewall.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/sshd-networks.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:56:35 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:56:35 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:56:35 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:56:35 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:56:35.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:56:36 np0005540826 python3.9[107292]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Dec  1 04:56:36 np0005540826 systemd[1]: Starting Time & Date Service...
Dec  1 04:56:36 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:56:36 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:56:36 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:56:36.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:56:36 np0005540826 systemd[1]: Started Time & Date Service.
Dec  1 04:56:37 np0005540826 python3.9[107449]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:56:37 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:56:37 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:56:37 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:56:37.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:56:37 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:56:37 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  1 04:56:37 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:56:37 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Dec  1 04:56:37 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:56:37 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Dec  1 04:56:37 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:56:37 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Dec  1 04:56:37 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:56:37 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Dec  1 04:56:37 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:56:37 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Dec  1 04:56:37 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:56:37 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Dec  1 04:56:37 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:56:37 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 04:56:37 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:56:37 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 04:56:37 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:56:37 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 04:56:37 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:56:37 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Dec  1 04:56:37 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:56:37 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 04:56:37 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:56:37 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Dec  1 04:56:37 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:56:37 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Dec  1 04:56:37 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:56:37 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Dec  1 04:56:37 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:56:37 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Dec  1 04:56:37 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:56:37 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Dec  1 04:56:37 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:56:37 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Dec  1 04:56:37 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:56:37 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Dec  1 04:56:37 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:56:37 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Dec  1 04:56:37 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:56:37 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Dec  1 04:56:37 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:56:37 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Dec  1 04:56:37 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:56:37 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Dec  1 04:56:37 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:56:37 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Dec  1 04:56:37 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:56:37 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  1 04:56:37 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:56:37 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Dec  1 04:56:37 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:56:37 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  1 04:56:37 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:56:37 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f29ac000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:37 np0005540826 python3.9[107638]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:56:38 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:56:38 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:56:38 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:56:38.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:56:38 np0005540826 python3.9[107721]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:56:38 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:56:38 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f29940016e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:39 np0005540826 ceph-mon[80026]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #22. Immutable memtables: 0.
Dec  1 04:56:39 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-09:56:39.015808) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  1 04:56:39 np0005540826 ceph-mon[80026]: rocksdb: [db/flush_job.cc:856] [default] [JOB 9] Flushing memtable with next log file: 22
Dec  1 04:56:39 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582999015841, "job": 9, "event": "flush_started", "num_memtables": 1, "num_entries": 1328, "num_deletes": 250, "total_data_size": 3376632, "memory_usage": 3403432, "flush_reason": "Manual Compaction"}
Dec  1 04:56:39 np0005540826 ceph-mon[80026]: rocksdb: [db/flush_job.cc:885] [default] [JOB 9] Level-0 flush table #23: started
Dec  1 04:56:39 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582999026023, "cf_name": "default", "job": 9, "event": "table_file_creation", "file_number": 23, "file_size": 1331906, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 10663, "largest_seqno": 11986, "table_properties": {"data_size": 1327527, "index_size": 1903, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1413, "raw_key_size": 11125, "raw_average_key_size": 20, "raw_value_size": 1318060, "raw_average_value_size": 2370, "num_data_blocks": 84, "num_entries": 556, "num_filter_entries": 556, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582880, "oldest_key_time": 1764582880, "file_creation_time": 1764582999, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d6e1f97e-eb58-41c1-b758-cb672eabd75e", "db_session_id": "KSHNHU57VR1LHZ6EZUM4", "orig_file_number": 23, "seqno_to_time_mapping": "N/A"}}
Dec  1 04:56:39 np0005540826 ceph-mon[80026]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 9] Flush lasted 10279 microseconds, and 3687 cpu microseconds.
Dec  1 04:56:39 np0005540826 ceph-mon[80026]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 04:56:39 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-09:56:39.026061) [db/flush_job.cc:967] [default] [JOB 9] Level-0 flush table #23: 1331906 bytes OK
Dec  1 04:56:39 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-09:56:39.026106) [db/memtable_list.cc:519] [default] Level-0 commit table #23 started
Dec  1 04:56:39 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-09:56:39.027144) [db/memtable_list.cc:722] [default] Level-0 commit table #23: memtable #1 done
Dec  1 04:56:39 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-09:56:39.027156) EVENT_LOG_v1 {"time_micros": 1764582999027152, "job": 9, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  1 04:56:39 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-09:56:39.027171) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  1 04:56:39 np0005540826 ceph-mon[80026]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 9] Try to delete WAL files size 3370374, prev total WAL file size 3370374, number of live WAL files 2.
Dec  1 04:56:39 np0005540826 ceph-mon[80026]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000019.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 04:56:39 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-09:56:39.028112) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740030' seq:72057594037927935, type:22 .. '6D67727374617400323531' seq:0, type:0; will stop at (end)
Dec  1 04:56:39 np0005540826 ceph-mon[80026]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 10] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  1 04:56:39 np0005540826 ceph-mon[80026]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 9 Base level 0, inputs: [23(1300KB)], [21(14MB)]
Dec  1 04:56:39 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582999028168, "job": 10, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [23], "files_L6": [21], "score": -1, "input_data_size": 16142239, "oldest_snapshot_seqno": -1}
Dec  1 04:56:39 np0005540826 ceph-mon[80026]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 10] Generated table #24: 4225 keys, 13855731 bytes, temperature: kUnknown
Dec  1 04:56:39 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582999089582, "cf_name": "default", "job": 10, "event": "table_file_creation", "file_number": 24, "file_size": 13855731, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13823854, "index_size": 20183, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10629, "raw_key_size": 107624, "raw_average_key_size": 25, "raw_value_size": 13742906, "raw_average_value_size": 3252, "num_data_blocks": 863, "num_entries": 4225, "num_filter_entries": 4225, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582559, "oldest_key_time": 0, "file_creation_time": 1764582999, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d6e1f97e-eb58-41c1-b758-cb672eabd75e", "db_session_id": "KSHNHU57VR1LHZ6EZUM4", "orig_file_number": 24, "seqno_to_time_mapping": "N/A"}}
Dec  1 04:56:39 np0005540826 ceph-mon[80026]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 04:56:39 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-09:56:39.089942) [db/compaction/compaction_job.cc:1663] [default] [JOB 10] Compacted 1@0 + 1@6 files to L6 => 13855731 bytes
Dec  1 04:56:39 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-09:56:39.091806) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 262.3 rd, 225.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.3, 14.1 +0.0 blob) out(13.2 +0.0 blob), read-write-amplify(22.5) write-amplify(10.4) OK, records in: 4687, records dropped: 462 output_compression: NoCompression
Dec  1 04:56:39 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-09:56:39.091827) EVENT_LOG_v1 {"time_micros": 1764582999091816, "job": 10, "event": "compaction_finished", "compaction_time_micros": 61542, "compaction_time_cpu_micros": 28315, "output_level": 6, "num_output_files": 1, "total_output_size": 13855731, "num_input_records": 4687, "num_output_records": 4225, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  1 04:56:39 np0005540826 ceph-mon[80026]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000023.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 04:56:39 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582999092216, "job": 10, "event": "table_file_deletion", "file_number": 23}
Dec  1 04:56:39 np0005540826 ceph-mon[80026]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000021.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 04:56:39 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582999094997, "job": 10, "event": "table_file_deletion", "file_number": 21}
Dec  1 04:56:39 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-09:56:39.027995) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:56:39 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-09:56:39.095121) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:56:39 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-09:56:39.095130) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:56:39 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-09:56:39.095132) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:56:39 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-09:56:39.095134) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:56:39 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-09:56:39.095136) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:56:39 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:56:39 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2984000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:39 np0005540826 python3.9[107873]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:56:39 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:56:39 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:56:39 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:56:39.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:56:39 np0005540826 python3.9[107951]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.iqj28_ff recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:56:39 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:56:39 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f297c000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:40 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:56:40 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:56:40 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 04:56:40 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:56:40.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 04:56:40 np0005540826 python3.9[108104]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:56:40 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:56:40 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2988000fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:40 np0005540826 python3.9[108182]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:56:41 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis[85207]: [WARNING] 334/095641 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  1 04:56:41 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:56:41 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f29940016e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:41 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:56:41 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 04:56:41 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:56:41.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 04:56:41 np0005540826 python3.9[108334]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:56:41 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:56:41 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f297c000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:42 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:56:42 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:56:42 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:56:42.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:56:42 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:56:42 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f29840016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:42 np0005540826 python3[108488]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec  1 04:56:43 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:56:43 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2988001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:43 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:56:43 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:56:43 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:56:43.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:56:43 np0005540826 python3.9[108640]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:56:43 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:56:43 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f29940016e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:44 np0005540826 python3.9[108718]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:56:44 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:56:44 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:56:44 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:56:44.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:56:44 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:56:44 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f297c001b20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:44 np0005540826 python3.9[108921]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:56:45 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:56:45 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2984001fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:45 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:56:45 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 04:56:45 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:56:45 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:56:45 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 04:56:45 np0005540826 python3.9[109030]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:56:45 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:56:45 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:56:45 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:56:45.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:56:45 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:56:45 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2988001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:46 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:56:46 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 04:56:46 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:56:46.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 04:56:46 np0005540826 python3.9[109183]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:56:46 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:56:46 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f29940016e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:47 np0005540826 python3.9[109261]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:56:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:56:47 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f297c001b20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:47 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:56:47 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:56:47 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:56:47.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:56:47 np0005540826 python3.9[109413]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:56:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:56:47 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2988001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:48 np0005540826 python3.9[109491]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:56:48 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:56:48 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:56:48 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:56:48.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:56:48 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:56:48 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f29940016e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:49 np0005540826 python3.9[109644]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:56:49 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:56:49 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f29940016e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:49 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:56:49 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:56:49 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:56:49.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:56:49 np0005540826 python3.9[109722]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-rules.nft _original_basename=ruleset.j2 recurse=False state=file path=/etc/nftables/edpm-rules.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:56:49 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:56:49 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f297c001b20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:50 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:56:50 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:56:50 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:56:50 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:56:50.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:56:50 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:56:50 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2988001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:50 np0005540826 python3.9[109875]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:56:51 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:56:51 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2988001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:51 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:56:51 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 04:56:51 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:56:51.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 04:56:51 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:56:51 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:56:51 np0005540826 python3.9[110055]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:56:51 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:56:51 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2984002160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:52 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:56:52 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 04:56:52 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:56:52.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 04:56:52 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:56:52 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f297c002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:52 np0005540826 python3.9[110208]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:56:53 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:56:53 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f297c002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:53 np0005540826 python3.9[110360]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:56:53 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:56:53 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:56:53 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:56:53.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:56:53 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:56:53 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f297c002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:54 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:56:54 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:56:54 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:56:54.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:56:54 np0005540826 python3.9[110512]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Dec  1 04:56:54 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:56:54 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2984002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:55 np0005540826 python3.9[110665]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Dec  1 04:56:55 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:56:55 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f297c002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:55 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:56:55 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:56:55 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:56:55 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:56:55.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:56:55 np0005540826 systemd-logind[787]: Session 43 logged out. Waiting for processes to exit.
Dec  1 04:56:55 np0005540826 systemd[1]: session-43.scope: Deactivated successfully.
Dec  1 04:56:55 np0005540826 systemd[1]: session-43.scope: Consumed 31.125s CPU time.
Dec  1 04:56:55 np0005540826 systemd-logind[787]: Removed session 43.
Dec  1 04:56:55 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:56:55 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f29940033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:56 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:56:56 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:56:56 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:56:56.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:56:56 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:56:56 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2988003730 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:57 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:56:57 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2988003730 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:57 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:56:57 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:56:57 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:56:57.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:56:57 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:56:57 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f297c002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:58 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:56:58 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:56:58 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:56:58.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:56:58 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:56:58 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f29940033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:59 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:56:59 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2984003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:59 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:56:59 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 04:56:59 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:56:59.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 04:56:59 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:56:59 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2988003730 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:00 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:57:00 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:57:00 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:57:00 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:57:00.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:57:00 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:57:00 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f297c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:00 np0005540826 systemd-logind[787]: New session 44 of user zuul.
Dec  1 04:57:00 np0005540826 systemd[1]: Started Session 44 of User zuul.
Dec  1 04:57:01 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:57:01 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f29940033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:01 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:57:01 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:57:01 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:57:01.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:57:01 np0005540826 python3.9[110873]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Dec  1 04:57:01 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:57:01 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2984003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:02 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:57:02 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:57:02 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:57:02.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:57:02 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:57:02 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2988003730 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:02 np0005540826 python3.9[111026]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:57:03 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:57:03 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f297c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:03 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:57:03 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:57:03 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:57:03.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:57:03 np0005540826 python3.9[111180]: ansible-ansible.builtin.slurp Invoked with src=/etc/ssh/ssh_known_hosts
Dec  1 04:57:03 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:57:03 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f29940033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:04 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:57:04 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 04:57:04 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:57:04.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 04:57:04 np0005540826 python3.9[111333]: ansible-ansible.legacy.stat Invoked with path=/tmp/ansible.65lm164c follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:57:04 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:57:04 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2984003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:05 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:57:05 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:57:05 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2988003730 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:05 np0005540826 python3.9[111458]: ansible-ansible.legacy.copy Invoked with dest=/tmp/ansible.65lm164c mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764583024.056017-103-27894456906594/.source.65lm164c _original_basename=.mz3rkz88 follow=False checksum=8dc09b174cc5b8debe148224e7d00f23d70f4242 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:57:05 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:57:05 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 04:57:05 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:57:05.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 04:57:05 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:57:05 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2988003730 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:06 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:57:06 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 04:57:06 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:57:06.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 04:57:06 np0005540826 systemd[1]: systemd-timedated.service: Deactivated successfully.
Dec  1 04:57:06 np0005540826 python3.9[111611]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:57:06 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:57:06 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f29940033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:07 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:57:07 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2984003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:07 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:57:07 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 04:57:07 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:57:07.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 04:57:07 np0005540826 python3.9[111765]: ansible-ansible.builtin.blockinfile Invoked with block=compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC9iOYT2GM4L6SHZTMq11oZ+BAk/eXQ8XBJJYa2Eo/9VKQiuDMNzjXWKc1heeqMgloaJAk+En3hPDTZcnt14xKW0weSVhc1GuXBU3IqdQGeO3nyjdhUNxj2O6Syt/8Srh0+ne/yimC9BxBrCHKmwPPCx0TTtiy3n953HP5w0wedM8MI2bl9X4CaVwEtwSUbhFJgRaAVvg1jWUBV+tE9CGQXy1Y7raeATTLvRa3PIqU2pSDvvN44SuFWubkATb9CNZfejG2Tz2N709KveFa1tPaAjiuj046dUN+nb5eMroLvf2T2MoSQ12AUXHcpxVB6qb918qUpn8x9/V65c4fkXQ3nNgbF3IHP7RcwSs0XISdGLMT1NPTmYDhECjFDqTwkiK+goHUXZY3N3dYfjS9uqS1/66OIDlWK6niL0DMO6j+L/iriIIzPVWmrEz384bDc+wVQgGjmVXolCOWq/vp6TE1nAFqsNTZmQXC8BHCGtitnnWgzgbJX3D4O4dBOqHqdPr8=#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGEIBRopLb4IdSGL1f5PVbv9932FzGHz/9YCDTQr6PvA#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDEJ0q084PIbFOMDxHa25lnKuVffDClzijZagkDx2W3Z17XxuTVNXMnebqlksv3x5cE8TQLF/PIAPJS87wX+Nuo=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC+tytlc2ziEXCaePFL6NCHfQfG5hnoDOgK+/O6WujzT2GFJESz6sgXypOXA+ry9uSM1AFkZgIIj7YfrFvtxYbWsEyzbhXKiOr8noIZGkfc+43imB+C2FgUp5ZwQSFnnxyIiXQWwKIjrOXbXE1r5SClA+FIAojDoectq/AbKwehIzD1ayHdfehF7BTfXJbkf64RgNcctGyjz0LPxY2mXC0kQXEFZSqJIOn5sys9wQEkjd4XlXA66oaJPV948m4ApJniNd9ohIVmXKAO5Bo6D4WQVvrA03w7PurWjJmpQuKNNwzAn2MMUfwfF0FiH9nxKa5/yEHRA/jTlNtqA/xOFC1uvGvgfWLDMfh+AtXxrNJXtp+qeATiUthHFK9ZRT6xaqkdd+LzySkLVyUCxpvEeOSKcHCqoxNBMZ5p9skmKbus5DRvzBSzPSGfBqh+7efuwSYYRveVZ2iqukef+cMJ5t+mlGuIAZulVVeLXhivpqH20o4d+WgBLNWpPZtP1w3vnds=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKDMbjmqVhbMiFxfeq71aiHzezH5+ve9aaRv6tecZ9yt#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 
AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBD2a9/UKab06QjpszdfyP/8+Fmx0ghbxasoTU/24//g4p6oYwAMEXLcqU8YkQj66SK/B/CRmkko20tQpuvcB+LQ=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDNxuYL62ECxG4tKU506Q3pIBb6yt0LTfxUgzUGORrXbIq9WrYwVeb+Lkx8v046r7H1KM8BsXHHuc+/3UYA3ldToNXUkjnpV43woAUm6zBViUE4+fgkcOJmVpRTZ/uXPMGTCGECUFZ9zuo3AFkcF0ERCcieOSdVs4uPytJLM0anMY2JZ9BHHzwlK3u+R7I452i/2bTjizB5yGGjV/5usLKdzn3gANHxbNcnVh+sI8fLZDldSAoeh+Lmihzsfp+4optdWgF0GnEgV3ui8NyR+nrPN2A09+4jC0EKzW3P8PT6CaTEgt95tkEYJ0/ihBlX210GmX32GEZfnHIOSflIiIeeAz/8vomjGlRwArfsmlOxT56Q9rekK5hD2orlFCjOvrzfoJN7vvTaE/P8ls/6015TUzbkS2WqhMLJbIvNcumWshvtYifwfnwMI2BK7YTHKpx1Qc/3anJqszHfO0G7ar3+3DemlY50qxApCrKUlE/w1rQtiN1VKmlioP2XpCmwe1s=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKm9ziDthsQekJ2ppuyoRsJLe7WplMYSfdzI6Ftkcb9s#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAnzEG8a/rCCjdE5RU3Uk/1EHo5xwDY20eWwn6aeXJMS7blUnv3gyCa8WoIefjhilEbylrojzG4Tmv2ZgeeLQd4=#012 create=True mode=0644 path=/tmp/ansible.65lm164c state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:57:07 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:57:07 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2988003730 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:08 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:57:08 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:57:08 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:57:08.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:57:08 np0005540826 python3.9[111918]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.65lm164c' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:57:08 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:57:08 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2988003730 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:09 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:57:09 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f29940033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:09 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:57:09 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:57:09 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:57:09.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:57:09 np0005540826 python3.9[112074]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.65lm164c state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:57:09 np0005540826 systemd[1]: session-44.scope: Deactivated successfully.
Dec  1 04:57:09 np0005540826 systemd[1]: session-44.scope: Consumed 5.335s CPU time.
Dec  1 04:57:09 np0005540826 systemd-logind[787]: Session 44 logged out. Waiting for processes to exit.
Dec  1 04:57:09 np0005540826 systemd-logind[787]: Removed session 44.
Dec  1 04:57:09 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:57:09 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2984003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:10 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:57:10 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:57:10 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:57:10 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:57:10.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:57:10 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:57:10 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2978000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:11 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:57:11 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f297c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:11 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:57:11 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 04:57:11 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:57:11.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 04:57:11 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:57:11 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f29940033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:12 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:57:12 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:57:12 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:57:12.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:57:12 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:57:12 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2984003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:13 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:57:13 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f29780016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:13 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:57:13 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:57:13 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:57:13.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:57:13 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:57:13 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f297c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:14 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:57:14 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:57:14 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:57:14.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:57:14 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:57:14 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f29940033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:15 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:57:15 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:57:15 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2984003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:15 np0005540826 systemd-logind[787]: New session 45 of user zuul.
Dec  1 04:57:15 np0005540826 systemd[1]: Started Session 45 of User zuul.
Dec  1 04:57:15 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:57:15 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:57:15 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:57:15.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:57:15 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:57:15 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f29780016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:16 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:57:16 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:57:16 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:57:16.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:57:16 np0005540826 python3.9[112255]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:57:16 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:57:16 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f297c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:17 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:57:17 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f29940033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:17 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:57:17 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 04:57:17 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:57:17.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 04:57:17 np0005540826 python3.9[112437]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Dec  1 04:57:17 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:57:17 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2984003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:18 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:57:18 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:57:18 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:57:18.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:57:18 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:57:18 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f29780016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:18 np0005540826 python3.9[112592]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  1 04:57:19 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:57:19 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f297c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:19 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:57:19 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 04:57:19 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:57:19.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 04:57:19 np0005540826 python3.9[112745]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:57:19 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:57:19 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f29940033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:20 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:57:20 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:57:20 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:57:20 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:57:20.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:57:20 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:57:20 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2984003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:20 np0005540826 python3.9[112899]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:57:21 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:57:21 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2978002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:21 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:57:21 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:57:21 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:57:21.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:57:21 np0005540826 python3.9[113051]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:57:21 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:57:21 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f297c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:22 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:57:22 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:57:22 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:57:22.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:57:22 np0005540826 systemd[1]: session-45.scope: Deactivated successfully.
Dec  1 04:57:22 np0005540826 systemd[1]: session-45.scope: Consumed 4.156s CPU time.
Dec  1 04:57:22 np0005540826 systemd-logind[787]: Session 45 logged out. Waiting for processes to exit.
Dec  1 04:57:22 np0005540826 systemd-logind[787]: Removed session 45.
Dec  1 04:57:22 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:57:22 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f29940033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:23 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:57:23 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f29940033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:23 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:57:23 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:57:23 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:57:23.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:57:23 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:57:23 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f29940033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:24 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:57:24 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:57:24 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:57:24.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:57:24 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:57:24 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f29940033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:25 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:57:25 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:57:25 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f29940033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:25 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:57:25 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 04:57:25 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:57:25.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 04:57:25 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis[85207]: [WARNING] 334/095725 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 1ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  1 04:57:25 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:57:25 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f29940033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:26 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:57:26 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:57:26 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:57:26.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:57:26 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:57:26 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f29940033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:27 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:57:27 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f29940033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:27 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:57:27 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 04:57:27 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:57:27.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 04:57:27 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:57:27 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f297c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:28 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:57:28 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:57:28 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:57:28.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:57:28 np0005540826 systemd-logind[787]: New session 46 of user zuul.
Dec  1 04:57:28 np0005540826 systemd[1]: Started Session 46 of User zuul.
Dec  1 04:57:28 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:57:28 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2978002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:29 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:57:29 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2978002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:29 np0005540826 python3.9[113234]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:57:29 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:57:29 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 04:57:29 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:57:29.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 04:57:29 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:57:29 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f29940033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:30 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:57:30 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:57:30 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:57:30 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:57:30.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:57:30 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:57:30 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f297c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:30 np0005540826 python3.9[113391]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  1 04:57:31 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:57:31 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2978002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:31 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:57:31 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 04:57:31 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:57:31.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 04:57:31 np0005540826 python3.9[113475]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec  1 04:57:31 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:57:31 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2984004530 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:32 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:57:32 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:57:32 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:57:32.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:57:32 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:57:32 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f29940033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:33 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:57:33 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f297c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:33 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:57:33 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:57:33 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:57:33.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:57:33 np0005540826 python3.9[113627]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:57:33 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:57:33 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  1 04:57:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:57:33 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2978002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:34 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:57:34 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 04:57:34 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:57:34.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 04:57:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:57:34 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2984004530 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:35 np0005540826 python3.9[113779]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec  1 04:57:35 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:57:35 np0005540826 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  1 04:57:35 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:57:35 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f29940033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:35 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:57:35 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:57:35 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:57:35.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:57:35 np0005540826 python3.9[113930]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:57:36 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:57:36 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f297c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:36 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:57:36 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:57:36 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:57:36.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:57:36 np0005540826 python3.9[114081]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/config follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:57:36 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:57:36 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2978003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:36 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:57:36 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  1 04:57:36 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:57:36 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 04:57:37 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:57:37 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2984004530 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:37 np0005540826 systemd[1]: session-46.scope: Deactivated successfully.
Dec  1 04:57:37 np0005540826 systemd[1]: session-46.scope: Consumed 5.879s CPU time.
Dec  1 04:57:37 np0005540826 systemd-logind[787]: Session 46 logged out. Waiting for processes to exit.
Dec  1 04:57:37 np0005540826 systemd-logind[787]: Removed session 46.
Dec  1 04:57:37 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:57:37 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:57:37 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:57:37.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:57:38 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:57:38 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f29940033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:38 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:57:38 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:57:38 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:57:38.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:57:38 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:57:38 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f297c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:39 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:57:39 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2978003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:39 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:57:39 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:57:39 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:57:39.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:57:39 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:57:39 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  1 04:57:40 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:57:40 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2984004530 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:40 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:57:40 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:57:40 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 04:57:40 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:57:40.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 04:57:40 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:57:40 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f29940033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:41 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:57:41 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f29880014d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:41 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:57:41 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:57:41 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:57:41.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:57:42 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:57:42 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f29880022c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:42 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:57:42 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 04:57:42 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:57:42.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 04:57:42 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:57:42 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2984004530 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:43 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:57:43 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f29940033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:43 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:57:43 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 04:57:43 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:57:43.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 04:57:43 np0005540826 systemd-logind[787]: New session 47 of user zuul.
Dec  1 04:57:43 np0005540826 systemd[1]: Started Session 47 of User zuul.
Dec  1 04:57:44 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:57:44 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f29a8001110 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:44 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:57:44 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:57:44 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:57:44.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:57:44 np0005540826 python3.9[114290]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:57:44 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:57:44 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f29880022c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:45 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:57:45 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:57:45 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2984004530 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:45 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:57:45 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:57:45 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:57:45.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:57:45 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis[85207]: [WARNING] 334/095745 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  1 04:57:46 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:57:46 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f29940033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:46 np0005540826 python3.9[114447]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:57:46 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:57:46 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:57:46 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:57:46.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:57:46 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:57:46 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f29a8001eb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:47 np0005540826 python3.9[114599]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:57:47 np0005540826 kernel: ganesha.nfsd[114132]: segfault at 50 ip 00007f2a5a60a32e sp 00007f2a13ffe210 error 4 in libntirpc.so.5.8[7f2a5a5ef000+2c000] likely on CPU 7 (core 0, socket 7)
Dec  1 04:57:47 np0005540826 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Dec  1 04:57:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[106377]: 01/12/2025 09:57:47 : epoch 692d6649 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f29880022c0 fd 38 proxy ignored for local
Dec  1 04:57:47 np0005540826 systemd[1]: Started Process Core Dump (PID 114624/UID 0).
Dec  1 04:57:47 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:57:47 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:57:47 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:57:47.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:57:47 np0005540826 python3.9[114753]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:57:48 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:57:48 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:57:48 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:57:48.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:57:48 np0005540826 systemd-coredump[114625]: Process 106381 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 58:#012#0  0x00007f2a5a60a32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Dec  1 04:57:48 np0005540826 systemd[1]: systemd-coredump@3-114624-0.service: Deactivated successfully.
Dec  1 04:57:48 np0005540826 systemd[1]: systemd-coredump@3-114624-0.service: Consumed 1.222s CPU time.
Dec  1 04:57:48 np0005540826 podman[114882]: 2025-12-01 09:57:48.603823333 +0000 UTC m=+0.042174217 container died 30edbe2666d8c7b68ba15d8d5234b2457f8262c9136500c9912331454c46b4d5 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, OSD_FLAVOR=default, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid)
Dec  1 04:57:48 np0005540826 systemd[1]: var-lib-containers-storage-overlay-ffa527e6adc1af04ac0a0706dd9f2618ef1e4a2193605a121df13301dc85fe8d-merged.mount: Deactivated successfully.
Dec  1 04:57:48 np0005540826 python3.9[114878]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583067.2920523-157-8034597846268/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=e432d28218c2bf9861e59af10936eed3a4f9b5c8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:57:48 np0005540826 podman[114882]: 2025-12-01 09:57:48.649355078 +0000 UTC m=+0.087705952 container remove 30edbe2666d8c7b68ba15d8d5234b2457f8262c9136500c9912331454c46b4d5 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True)
Dec  1 04:57:48 np0005540826 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.0.0.compute-1.osfnzc.service: Main process exited, code=exited, status=139/n/a
Dec  1 04:57:48 np0005540826 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.0.0.compute-1.osfnzc.service: Failed with result 'exit-code'.
Dec  1 04:57:48 np0005540826 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.0.0.compute-1.osfnzc.service: Consumed 1.681s CPU time.
Dec  1 04:57:49 np0005540826 python3.9[115076]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:57:49 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:57:49 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 04:57:49 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:57:49.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 04:57:49 np0005540826 python3.9[115199]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583068.8133204-157-63953825615925/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=cb547b0bb0278866a992ba3ec36d52c9fc332990 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:57:50 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:57:50 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:57:50 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:57:50 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:57:50.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:57:50 np0005540826 python3.9[115352]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:57:51 np0005540826 python3.9[115475]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583070.0533614-157-251405574647170/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=f0893166aff7a4615e4af27f691bb525d6abf510 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:57:51 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:57:51 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:57:51 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:57:51.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:57:51 np0005540826 python3.9[115748]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:57:52 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:57:52 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:57:52 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:57:52.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:57:52 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:57:52 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:57:52 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:57:52 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:57:52 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 04:57:52 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:57:52 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:57:52 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 04:57:52 np0005540826 python3.9[115932]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:57:53 np0005540826 python3.9[116084]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:57:53 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis[85207]: [WARNING] 334/095753 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  1 04:57:53 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:57:53 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:57:53 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:57:53.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:57:53 np0005540826 python3.9[116207]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583072.68376-338-225642471069691/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=c0f9ff9538769a66afc747130029cd279ebe3fc0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:57:54 np0005540826 python3.9[116360]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:57:54 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:57:54 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 04:57:54 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:57:54.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 04:57:54 np0005540826 python3.9[116483]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583073.8413322-338-129911833965585/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=35a392a510e9baafc6c00afe5c05a05ddead468b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:57:55 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:57:55 np0005540826 python3.9[116635]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:57:55 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:57:55 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:57:55 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:57:55.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:57:56 np0005540826 python3.9[116758]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583075.0261438-338-184743730317076/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=231c521b354536452bef811d2d0565bbc3263a5c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:57:56 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:57:56 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 04:57:56 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:57:56.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 04:57:56 np0005540826 python3.9[116911]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:57:57 np0005540826 python3.9[117088]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:57:57 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:57:57 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:57:57 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:57:57.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:57:58 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:57:58 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:57:58 np0005540826 python3.9[117265]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:57:58 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:57:58 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:57:58 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:57:58.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:57:58 np0005540826 python3.9[117389]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583077.6554894-517-42552465301326/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=7dc9fb95f3afa944bb393e392fabfd4c89dc9178 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:57:58 np0005540826 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.0.0.compute-1.osfnzc.service: Scheduled restart job, restart counter is at 4.
Dec  1 04:57:58 np0005540826 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.osfnzc for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 04:57:58 np0005540826 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.0.0.compute-1.osfnzc.service: Consumed 1.681s CPU time.
Dec  1 04:57:58 np0005540826 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.osfnzc for 365f19c2-81e5-5edd-b6b4-280555214d3a...
Dec  1 04:57:59 np0005540826 podman[117587]: 2025-12-01 09:57:59.155879024 +0000 UTC m=+0.037743999 container create 7afbc568252b123e1174f83599752aae8d4ff49bac5b1eacc9effd976fa2de40 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325)
Dec  1 04:57:59 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8dfe7b1104917abc52c46fe9bf1a5a9acf3f92565bc1c45a61f31c00e4d04ddf/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Dec  1 04:57:59 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8dfe7b1104917abc52c46fe9bf1a5a9acf3f92565bc1c45a61f31c00e4d04ddf/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:57:59 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8dfe7b1104917abc52c46fe9bf1a5a9acf3f92565bc1c45a61f31c00e4d04ddf/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 04:57:59 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8dfe7b1104917abc52c46fe9bf1a5a9acf3f92565bc1c45a61f31c00e4d04ddf/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.osfnzc-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 04:57:59 np0005540826 podman[117587]: 2025-12-01 09:57:59.217755182 +0000 UTC m=+0.099620197 container init 7afbc568252b123e1174f83599752aae8d4ff49bac5b1eacc9effd976fa2de40 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc, org.label-schema.license=GPLv2, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325)
Dec  1 04:57:59 np0005540826 podman[117587]: 2025-12-01 09:57:59.224568245 +0000 UTC m=+0.106433240 container start 7afbc568252b123e1174f83599752aae8d4ff49bac5b1eacc9effd976fa2de40 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_REF=squid, ceph=True, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default)
Dec  1 04:57:59 np0005540826 bash[117587]: 7afbc568252b123e1174f83599752aae8d4ff49bac5b1eacc9effd976fa2de40
Dec  1 04:57:59 np0005540826 podman[117587]: 2025-12-01 09:57:59.13957589 +0000 UTC m=+0.021440885 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 04:57:59 np0005540826 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.osfnzc for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 04:57:59 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:57:59 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Dec  1 04:57:59 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:57:59 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Dec  1 04:57:59 np0005540826 python3.9[117586]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:57:59 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:57:59 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Dec  1 04:57:59 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:57:59 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Dec  1 04:57:59 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:57:59 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Dec  1 04:57:59 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:57:59 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Dec  1 04:57:59 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:57:59 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Dec  1 04:57:59 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:57:59 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  1 04:57:59 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:57:59 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:57:59 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:57:59.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:57:59 np0005540826 python3.9[117766]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583078.834332-517-75831155130033/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=35a392a510e9baafc6c00afe5c05a05ddead468b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:58:00 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:58:00 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:58:00 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:58:00 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:58:00.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:58:00 np0005540826 python3.9[117919]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:58:00 np0005540826 python3.9[118042]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583079.960501-517-236751374953523/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=b05c51e808af3331625c133029486fae575f165e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:58:01 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:58:01 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:58:01 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:58:01.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:58:02 np0005540826 python3.9[118194]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:58:02 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:58:02 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:58:02 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:58:02.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:58:02 np0005540826 python3.9[118347]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:58:03 np0005540826 python3.9[118470]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583082.3756979-715-55987971662051/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=8c8748787c49c5bdccd5df153e138fac81f5459e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:58:03 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:58:03 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:58:03 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:58:03.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:58:04 np0005540826 python3.9[118622]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:58:04 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:58:04 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:58:04 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:58:04.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:58:04 np0005540826 python3.9[118775]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:58:05 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:58:05 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:05 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  1 04:58:05 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:05 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 04:58:05 np0005540826 python3.9[118898]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583084.4118276-789-133843186301693/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=8c8748787c49c5bdccd5df153e138fac81f5459e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:58:05 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:58:05 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:58:05 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:58:05.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:58:06 np0005540826 python3.9[119050]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:58:06 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:58:06 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:58:06 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:58:06.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:58:06 np0005540826 python3.9[119203]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:58:07 np0005540826 python3.9[119326]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583086.3387964-861-107306470695258/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=8c8748787c49c5bdccd5df153e138fac81f5459e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:58:07 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:58:07 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:58:07 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:58:07.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:58:07 np0005540826 python3.9[119478]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:58:08 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:58:08 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:58:08 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:58:08.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:58:08 np0005540826 python3.9[119631]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:58:09 np0005540826 python3.9[119754]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583088.1616683-928-242730808204023/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=8c8748787c49c5bdccd5df153e138fac81f5459e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:58:09 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:58:09 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 04:58:09 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:58:09.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 04:58:09 np0005540826 python3.9[119906]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:58:10 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:58:10 np0005540826 python3.9[120059]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:58:10 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:58:10 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 04:58:10 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:58:10.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 04:58:10 np0005540826 python3.9[120182]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583089.9533079-998-202417258822018/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=8c8748787c49c5bdccd5df153e138fac81f5459e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:58:11 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:11 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  1 04:58:11 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:11 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Dec  1 04:58:11 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:11 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Dec  1 04:58:11 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:11 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Dec  1 04:58:11 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:11 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Dec  1 04:58:11 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:11 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Dec  1 04:58:11 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:11 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Dec  1 04:58:11 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:11 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 04:58:11 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:11 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 04:58:11 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:11 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 04:58:11 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:11 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Dec  1 04:58:11 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:11 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 04:58:11 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:11 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Dec  1 04:58:11 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:11 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Dec  1 04:58:11 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:11 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Dec  1 04:58:11 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:11 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Dec  1 04:58:11 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:11 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Dec  1 04:58:11 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:11 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Dec  1 04:58:11 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:11 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Dec  1 04:58:11 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:11 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Dec  1 04:58:11 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:11 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Dec  1 04:58:11 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:11 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Dec  1 04:58:11 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:11 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Dec  1 04:58:11 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:11 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Dec  1 04:58:11 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:11 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  1 04:58:11 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:11 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Dec  1 04:58:11 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:11 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  1 04:58:11 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:58:11 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:58:11 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:58:11.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:58:11 np0005540826 python3.9[120346]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:58:12 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:12 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd38000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:12 np0005540826 python3.9[120498]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:58:12 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:58:12 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 04:58:12 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:58:12.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 04:58:12 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:12 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd200016e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:12 np0005540826 python3.9[120626]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583091.7707684-1068-102370909176389/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=8c8748787c49c5bdccd5df153e138fac81f5459e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:58:13 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:13 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd200016e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:13 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:58:13 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 04:58:13 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:58:13.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 04:58:14 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:14 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd38000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:14 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:58:14 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:58:14 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:58:14.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:58:14 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:14 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd14000fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:15 np0005540826 systemd[1]: session-47.scope: Deactivated successfully.
Dec  1 04:58:15 np0005540826 systemd[1]: session-47.scope: Consumed 23.336s CPU time.
Dec  1 04:58:15 np0005540826 systemd-logind[787]: Session 47 logged out. Waiting for processes to exit.
Dec  1 04:58:15 np0005540826 systemd-logind[787]: Removed session 47.
Dec  1 04:58:15 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:58:15 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis[85207]: [WARNING] 334/095815 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  1 04:58:15 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:15 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd0c000d00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:15 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:58:15 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:58:15 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:58:15.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:58:16 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:16 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd200023f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:16 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:58:16 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 04:58:16 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:58:16.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 04:58:16 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:16 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd38000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:17 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:17 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd14001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:17 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:58:17 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:58:17 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:58:17.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:58:18 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:18 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd0c001820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:18 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:58:18 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 04:58:18 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:58:18.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 04:58:18 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:18 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd200023f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:18 np0005540826 ceph-mon[80026]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #25. Immutable memtables: 0.
Dec  1 04:58:18 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-09:58:18.772269) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  1 04:58:18 np0005540826 ceph-mon[80026]: rocksdb: [db/flush_job.cc:856] [default] [JOB 11] Flushing memtable with next log file: 25
Dec  1 04:58:18 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583098772357, "job": 11, "event": "flush_started", "num_memtables": 1, "num_entries": 1244, "num_deletes": 251, "total_data_size": 3168506, "memory_usage": 3223632, "flush_reason": "Manual Compaction"}
Dec  1 04:58:18 np0005540826 ceph-mon[80026]: rocksdb: [db/flush_job.cc:885] [default] [JOB 11] Level-0 flush table #26: started
Dec  1 04:58:18 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583098785628, "cf_name": "default", "job": 11, "event": "table_file_creation", "file_number": 26, "file_size": 2042220, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 11991, "largest_seqno": 13230, "table_properties": {"data_size": 2036843, "index_size": 2837, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1477, "raw_key_size": 11298, "raw_average_key_size": 19, "raw_value_size": 2026011, "raw_average_value_size": 3457, "num_data_blocks": 126, "num_entries": 586, "num_filter_entries": 586, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764583000, "oldest_key_time": 1764583000, "file_creation_time": 1764583098, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d6e1f97e-eb58-41c1-b758-cb672eabd75e", "db_session_id": "KSHNHU57VR1LHZ6EZUM4", "orig_file_number": 26, "seqno_to_time_mapping": "N/A"}}
Dec  1 04:58:18 np0005540826 ceph-mon[80026]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 11] Flush lasted 13449 microseconds, and 6265 cpu microseconds.
Dec  1 04:58:18 np0005540826 ceph-mon[80026]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 04:58:18 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-09:58:18.785720) [db/flush_job.cc:967] [default] [JOB 11] Level-0 flush table #26: 2042220 bytes OK
Dec  1 04:58:18 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-09:58:18.785757) [db/memtable_list.cc:519] [default] Level-0 commit table #26 started
Dec  1 04:58:18 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-09:58:18.789258) [db/memtable_list.cc:722] [default] Level-0 commit table #26: memtable #1 done
Dec  1 04:58:18 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-09:58:18.789284) EVENT_LOG_v1 {"time_micros": 1764583098789276, "job": 11, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  1 04:58:18 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-09:58:18.789312) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  1 04:58:18 np0005540826 ceph-mon[80026]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 11] Try to delete WAL files size 3162547, prev total WAL file size 3162547, number of live WAL files 2.
Dec  1 04:58:18 np0005540826 ceph-mon[80026]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000022.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 04:58:18 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-09:58:18.791001) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F7300353032' seq:72057594037927935, type:22 .. '7061786F7300373534' seq:0, type:0; will stop at (end)
Dec  1 04:58:18 np0005540826 ceph-mon[80026]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 12] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  1 04:58:18 np0005540826 ceph-mon[80026]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 11 Base level 0, inputs: [26(1994KB)], [24(13MB)]
Dec  1 04:58:18 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583098791113, "job": 12, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [26], "files_L6": [24], "score": -1, "input_data_size": 15897951, "oldest_snapshot_seqno": -1}
Dec  1 04:58:18 np0005540826 ceph-mon[80026]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 12] Generated table #27: 4293 keys, 13794158 bytes, temperature: kUnknown
Dec  1 04:58:18 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583098849560, "cf_name": "default", "job": 12, "event": "table_file_creation", "file_number": 27, "file_size": 13794158, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13762614, "index_size": 19722, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10757, "raw_key_size": 109804, "raw_average_key_size": 25, "raw_value_size": 13681204, "raw_average_value_size": 3186, "num_data_blocks": 832, "num_entries": 4293, "num_filter_entries": 4293, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582559, "oldest_key_time": 0, "file_creation_time": 1764583098, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d6e1f97e-eb58-41c1-b758-cb672eabd75e", "db_session_id": "KSHNHU57VR1LHZ6EZUM4", "orig_file_number": 27, "seqno_to_time_mapping": "N/A"}}
Dec  1 04:58:18 np0005540826 ceph-mon[80026]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 04:58:18 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-09:58:18.849840) [db/compaction/compaction_job.cc:1663] [default] [JOB 12] Compacted 1@0 + 1@6 files to L6 => 13794158 bytes
Dec  1 04:58:18 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-09:58:18.851018) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 271.7 rd, 235.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.9, 13.2 +0.0 blob) out(13.2 +0.0 blob), read-write-amplify(14.5) write-amplify(6.8) OK, records in: 4811, records dropped: 518 output_compression: NoCompression
Dec  1 04:58:18 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-09:58:18.851040) EVENT_LOG_v1 {"time_micros": 1764583098851030, "job": 12, "event": "compaction_finished", "compaction_time_micros": 58522, "compaction_time_cpu_micros": 28341, "output_level": 6, "num_output_files": 1, "total_output_size": 13794158, "num_input_records": 4811, "num_output_records": 4293, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  1 04:58:18 np0005540826 ceph-mon[80026]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000026.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 04:58:18 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583098851686, "job": 12, "event": "table_file_deletion", "file_number": 26}
Dec  1 04:58:18 np0005540826 ceph-mon[80026]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000024.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 04:58:18 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583098854703, "job": 12, "event": "table_file_deletion", "file_number": 24}
Dec  1 04:58:18 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-09:58:18.790877) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:58:18 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-09:58:18.854742) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:58:18 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-09:58:18.854747) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:58:18 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-09:58:18.854749) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:58:18 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-09:58:18.854752) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:58:18 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-09:58:18.854753) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:58:19 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:19 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd38000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:19 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:58:19 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:58:19 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:58:19.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:58:20 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:20 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd14001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:20 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:58:20 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:58:20 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 04:58:20 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:58:20.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 04:58:20 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:20 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd0c001820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:21 np0005540826 systemd-logind[787]: New session 48 of user zuul.
Dec  1 04:58:21 np0005540826 systemd[1]: Started Session 48 of User zuul.
Dec  1 04:58:21 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:21 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd38000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:21 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:58:21 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:58:21 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:58:21.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:58:21 np0005540826 python3.9[120835]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:58:22 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:22 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd200023f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:22 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:58:22 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:58:22 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:58:22.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:58:22 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:22 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd14001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:22 np0005540826 python3.9[120988]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:58:23 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:23 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd0c001820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:23 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:58:23 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:58:23 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:58:23.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:58:23 np0005540826 python3.9[121111]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764583102.2482057-63-18975961415357/.source.conf _original_basename=ceph.conf follow=False checksum=0a8180f0f80a13ef358ded9b1ade2f059a9b256f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:58:24 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:24 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd380095a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:24 np0005540826 python3.9[121263]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:58:24 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:58:24 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:58:24 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:58:24.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:58:24 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:24 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd200023f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:24 np0005540826 python3.9[121387]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764583103.7943432-63-129664983817248/.source.keyring _original_basename=ceph.client.openstack.keyring follow=False checksum=5a16a5bd4a7ebcbad903a4d80924389de6535d80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:58:25 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:58:25 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:25 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd14002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:25 np0005540826 systemd[1]: session-48.scope: Deactivated successfully.
Dec  1 04:58:25 np0005540826 systemd[1]: session-48.scope: Consumed 2.845s CPU time.
Dec  1 04:58:25 np0005540826 systemd-logind[787]: Session 48 logged out. Waiting for processes to exit.
Dec  1 04:58:25 np0005540826 systemd-logind[787]: Removed session 48.
Dec  1 04:58:25 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:58:25 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:58:25 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:58:25.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:58:26 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:26 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd0c002cb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:26 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:58:26 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:58:26 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:58:26.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:58:26 np0005540826 ceph-osd[77525]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec  1 04:58:26 np0005540826 ceph-osd[77525]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 600.0 total, 600.0 interval#012Cumulative writes: 7126 writes, 30K keys, 7126 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.03 MB/s#012Cumulative WAL: 7126 writes, 1175 syncs, 6.06 writes per sync, written: 0.02 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 7126 writes, 30K keys, 7126 commit groups, 1.0 writes per commit group, ingest: 20.44 MB, 0.03 MB/s#012Interval WAL: 7126 writes, 1175 syncs, 6.06 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0#012 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) 
Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.0 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55e0b36fd350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.5e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) 
Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-0] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.0 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55e0b36fd350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 
collections: 2 last_copies: 8 last_secs: 3.5e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-1] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.0 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 
seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable
Dec  1 04:58:26 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:26 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd0c002cb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:27 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:27 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd200023f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:27 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:58:27 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:58:27 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:58:27.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:58:28 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:28 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd14002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:28 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:58:28 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:58:28 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:58:28.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:58:28 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:28 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd0c002cb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:29 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:29 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd38009ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:29 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:58:29 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:58:29 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:58:29.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:58:30 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:30 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd200023f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:30 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:58:30 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:58:30 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:58:30 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:58:30.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:58:30 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:30 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd14002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:30 np0005540826 systemd-logind[787]: New session 49 of user zuul.
Dec  1 04:58:30 np0005540826 systemd[1]: Started Session 49 of User zuul.
Dec  1 04:58:31 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:31 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd0c002cb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:31 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:58:31 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:58:31 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:58:31.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:58:32 np0005540826 python3.9[121568]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:58:32 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:32 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd38009ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:32 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:58:32 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 04:58:32 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:58:32.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 04:58:32 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:32 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd38009ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:33 np0005540826 python3.9[121725]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:58:33 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:33 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd38009ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:33 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:58:33 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 04:58:33 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:58:33.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 04:58:33 np0005540826 python3.9[121877]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:58:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:34 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd0c002cb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:34 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:58:34 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 04:58:34 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:58:34.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 04:58:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:34 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd200023f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:34 np0005540826 python3.9[122028]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:58:35 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:58:35 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:35 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd38009ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:35 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:58:35 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:58:35 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:58:35.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:58:35 np0005540826 python3.9[122180]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Dec  1 04:58:36 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:36 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd14004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:36 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:58:36 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:58:36 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:58:36.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:58:36 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:36 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd0c002cb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:37 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:37 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd200023f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:37 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:58:37 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:58:37 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:58:37.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:58:37 np0005540826 dbus-broker-launch[776]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Dec  1 04:58:38 np0005540826 python3.9[122360]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  1 04:58:38 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:38 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd38009ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:38 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:58:38 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:58:38 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:58:38.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:58:38 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:38 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd14004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:38 np0005540826 python3.9[122447]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  1 04:58:39 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:39 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd0c003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:39 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:58:39 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:58:39 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:58:39.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:58:40 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:40 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd200023f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:40 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:58:40 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:58:40 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:58:40 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:58:40.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:58:40 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:40 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd38009ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:41 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:41 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd14004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:41 np0005540826 python3.9[122601]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec  1 04:58:41 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:58:41 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 04:58:41 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:58:41.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 04:58:42 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:42 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd0c003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:42 np0005540826 python3[122756]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks#012  rule:#012    proto: udp#012    dport: 4789#012- rule_name: 119 neutron geneve networks#012  rule:#012    proto: udp#012    dport: 6081#012    state: ["UNTRACKED"]#012- rule_name: 120 neutron geneve networks no conntrack#012  rule:#012    proto: udp#012    dport: 6081#012    table: raw#012    chain: OUTPUT#012    jump: NOTRACK#012    action: append#012    state: []#012- rule_name: 121 neutron geneve networks no conntrack#012  rule:#012    proto: udp#012    dport: 6081#012    table: raw#012    chain: PREROUTING#012    jump: NOTRACK#012    action: append#012    state: []#012 dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Dec  1 04:58:42 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:58:42 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:58:42 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:58:42.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:58:42 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:42 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd200023f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:43 np0005540826 python3.9[122909]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:58:43 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:43 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd38009ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:43 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:58:43 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 04:58:43 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:58:43.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 04:58:44 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:44 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd14004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:44 np0005540826 python3.9[123063]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:58:44 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:58:44 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:58:44 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:58:44.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:58:44 np0005540826 python3.9[123142]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:58:44 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:44 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdcf8000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:45 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:58:45 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:45 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd200023f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:45 np0005540826 python3.9[123294]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:58:45 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:58:45 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:58:45 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:58:45.592 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:58:45 np0005540826 python3.9[123372]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.m7ix6q3g recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:58:46 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:46 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd38009ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:46 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:58:46 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:58:46 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:58:46.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:58:46 np0005540826 python3.9[123526]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:58:46 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:46 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd14004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:47 np0005540826 python3.9[123604]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:58:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:47 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdcf80016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:47 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:58:47 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 04:58:47 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:58:47.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 04:58:48 np0005540826 python3.9[123756]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:58:48 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:48 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd200023f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:48 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:58:48 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 04:58:48 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:58:48.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 04:58:48 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:48 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd38009ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:48 np0005540826 python3[123910]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec  1 04:58:49 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:49 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd14004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:49 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:58:49 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:58:49 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:58:49.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:58:49 np0005540826 python3.9[124062]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:58:50 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:50 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdcf80016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:50 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:58:50 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:58:50 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:58:50 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:58:50.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:58:50 np0005540826 python3.9[124188]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583129.210028-432-208536361175976/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:58:50 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:50 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd200023f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:51 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:51 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd38009ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:51 np0005540826 python3.9[124340]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:58:51 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:58:51 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:58:51 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:58:51.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:58:51 np0005540826 python3.9[124465]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583130.7760503-477-196075617494134/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:58:52 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:52 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd14004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:52 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:58:52 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:58:52 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:58:52.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:58:52 np0005540826 python3.9[124618]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:58:52 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:52 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd14004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:53 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:53 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd200023f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:53 np0005540826 python3.9[124743]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583132.2319639-522-185386861775650/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:58:53 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:58:53 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 04:58:53 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:58:53.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 04:58:54 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:54 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd38009ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:54 np0005540826 python3.9[124896]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:58:54 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:58:54 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:58:54 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:58:54.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:58:54 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:54 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd14004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:54 np0005540826 python3.9[125021]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583133.6081202-567-108008281510540/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:58:55 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:58:55 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:55 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdcf8002720 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:55 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:58:55 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 04:58:55 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:58:55.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 04:58:55 np0005540826 python3.9[125173]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:58:56 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:56 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd200023f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:56 np0005540826 python3.9[125299]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583135.1480167-612-151594631888729/.source.nft follow=False _original_basename=ruleset.j2 checksum=bdba38546f86123f1927359d89789bd211aba99d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:58:56 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:58:56 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:58:56 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:58:56.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:58:56 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:56 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd38009ee0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:57 np0005540826 python3.9[125451]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:58:57 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:57 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd14004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:57 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:58:57 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 04:58:57 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:58:57.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 04:58:58 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:58 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdcf8002720 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:58 np0005540826 python3.9[125693]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:58:58 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:58:58 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 04:58:58 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:58:58.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 04:58:58 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:58 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd200023f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:58 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:58:58 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:58:58 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 04:58:58 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:58:58 np0005540826 python3.9[125865]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:58:59 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:58:59 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd38009f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:59 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:58:59 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:58:59 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:58:59.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:58:59 np0005540826 python3.9[126017]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:58:59 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:58:59 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 04:59:00 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:59:00 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd14004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:59:00 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:59:00 np0005540826 python3.9[126171]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:59:00 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:59:00 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:59:00 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:59:00.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:59:00 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:59:00 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdcf8002720 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:59:01 np0005540826 python3.9[126325]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:59:01 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:59:01 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd200023f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:59:01 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:59:01 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:59:01 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:59:01.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:59:01 np0005540826 python3.9[126480]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:59:02 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:59:02 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd38009f20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:59:02 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:59:02 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:59:02 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:59:02.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:59:02 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:59:02 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd14004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:59:03 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:59:03 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd14004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:59:03 np0005540826 python3.9[126631]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:59:03 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:59:03 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 04:59:03 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:59:03.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 04:59:04 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:59:04 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd200023f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:59:04 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:59:04 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:59:04 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:59:04.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:59:04 np0005540826 python3.9[126794]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-1.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:2e:0a:c6:22:5a:f7" external_ids:ovn-encap-ip=172.19.0.101 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch #012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:59:04 np0005540826 ovs-vsctl[126811]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-1.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:2e:0a:c6:22:5a:f7 external_ids:ovn-encap-ip=172.19.0.101 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Dec  1 04:59:04 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:59:04 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd38009f40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:59:05 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:59:05 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:59:05 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd14004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:59:05 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:59:05 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:59:05 np0005540826 python3.9[126963]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ovs-vsctl show | grep -q "Manager"#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:59:05 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:59:05 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:59:05 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:59:05.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:59:06 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:59:06 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd14004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:59:06 np0005540826 python3.9[127118]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:59:06 np0005540826 ovs-vsctl[127120]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Dec  1 04:59:06 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:59:06 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:59:06 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:59:06.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:59:06 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:59:06 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd200023f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:59:06 np0005540826 python3.9[127270]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:59:07 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:59:07 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd38009f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:59:07 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:59:07 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 04:59:07 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:59:07.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 04:59:07 np0005540826 python3.9[127424]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:59:08 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:59:08 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd14004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:59:08 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:59:08 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:59:08 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:59:08.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:59:08 np0005540826 python3.9[127577]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:59:08 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:59:08 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdcf8003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:59:09 np0005540826 python3.9[127655]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:59:09 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:59:09 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd200023f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:59:09 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:59:09 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:59:09 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:59:09.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:59:09 np0005540826 python3.9[127807]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:59:10 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:59:10 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd38009f80 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:59:10 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:59:10 np0005540826 python3.9[127885]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:59:10 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:59:10 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:59:10 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:59:10.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:59:10 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:59:10 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd14004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:59:11 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:59:11 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdcf8003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:59:11 np0005540826 python3.9[128038]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:59:11 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:59:11 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:59:11 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:59:11.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:59:12 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:59:12 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd200023f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:59:12 np0005540826 python3.9[128190]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:59:12 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:59:12 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:59:12 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:59:12.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:59:12 np0005540826 python3.9[128269]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:59:12 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:59:12 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd38009fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:59:13 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:59:13 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd14004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:59:13 np0005540826 python3.9[128421]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:59:13 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:59:13 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:59:13 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:59:13.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:59:13 np0005540826 python3.9[128499]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:59:14 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:59:14 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdcf8003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:59:14 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:59:14 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:59:14 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:59:14.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:59:14 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:59:14 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd200023f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:59:14 np0005540826 python3.9[128653]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 04:59:14 np0005540826 systemd[1]: Reloading.
Dec  1 04:59:14 np0005540826 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:59:14 np0005540826 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:59:15 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:59:15 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:59:15 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd200023f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:59:15 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:59:15 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 04:59:15 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:59:15.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 04:59:15 np0005540826 python3.9[128844]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:59:16 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis[85207]: [WARNING] 334/095916 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  1 04:59:16 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:59:16 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd00000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:59:16 np0005540826 python3.9[128923]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:59:16 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:59:16 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:59:16 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:59:16.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:59:16 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:59:16 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdcf8003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:59:17 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:59:17 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd0c001230 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:59:17 np0005540826 python3.9[129075]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:59:17 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:59:17 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:59:17 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:59:17.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:59:17 np0005540826 python3.9[129153]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:59:18 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:59:18 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd20004060 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:59:18 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:59:18 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:59:18 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:59:18.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:59:18 np0005540826 python3.9[129331]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 04:59:18 np0005540826 systemd[1]: Reloading.
Dec  1 04:59:18 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:59:18 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd000016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:59:18 np0005540826 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:59:18 np0005540826 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:59:19 np0005540826 systemd[1]: Starting Create netns directory...
Dec  1 04:59:19 np0005540826 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec  1 04:59:19 np0005540826 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec  1 04:59:19 np0005540826 systemd[1]: Finished Create netns directory.
Dec  1 04:59:19 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:59:19 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdcf8003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:59:19 np0005540826 ceph-mon[80026]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec  1 04:59:19 np0005540826 ceph-mon[80026]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 600.0 total, 600.0 interval#012Cumulative writes: 2204 writes, 13K keys, 2204 commit groups, 1.0 writes per commit group, ingest: 0.04 GB, 0.06 MB/s#012Cumulative WAL: 2204 writes, 2204 syncs, 1.00 writes per sync, written: 0.04 GB, 0.06 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 2204 writes, 13K keys, 2204 commit groups, 1.0 writes per commit group, ingest: 38.29 MB, 0.06 MB/s#012Interval WAL: 2204 writes, 2204 syncs, 1.00 writes per sync, written: 0.04 GB, 0.06 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0    133.0      0.17              0.07         6    0.028       0      0       0.0       0.0#012  L6      1/0   13.16 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   3.0    189.0    166.4      0.39              0.19         5    0.078     22K   2298       0.0       0.0#012 Sum      1/0   13.16 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   4.0    132.9    156.5      0.56              0.26        11    0.051     22K   2298       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   4.0    133.3    157.0      0.55              0.26        10    0.055     22K   2298       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) 
Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   0.0    189.0    166.4      0.39              0.19         5    0.078     22K   2298       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0    134.4      0.16              0.07         5    0.033       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.0 total, 600.0 interval#012Flush(GB): cumulative 0.021, interval 0.021#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.09 GB write, 0.15 MB/s write, 0.07 GB read, 0.12 MB/s read, 0.6 seconds#012Interval compaction: 0.09 GB write, 0.15 MB/s write, 0.07 GB read, 0.12 MB/s read, 0.6 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55d4317d9350#2 capacity: 304.00 MB usage: 1.53 MB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 0 last_secs: 6.2e-05 secs_since: 0#012Block cache entry stats(count,size,portion): 
DataBlock(88,1.32 MB,0.434087%) FilterBlock(11,71.55 KB,0.0229836%) IndexBlock(11,138.77 KB,0.0445767%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Dec  1 04:59:19 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:59:19 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 04:59:19 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:59:19.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 04:59:20 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:59:20 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd0c0013d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:59:20 np0005540826 python3.9[129525]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:59:20 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:59:20 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:59:20 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:59:20 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:59:20.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:59:20 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:59:20 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd0c0013d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:59:20 np0005540826 python3.9[129678]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:59:21 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:59:21 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd20004060 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:59:21 np0005540826 python3.9[129801]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764583160.3581347-1365-160221066550800/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:59:21 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:59:21 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:59:21 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:59:21.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:59:22 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:59:22 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdcf8003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:59:22 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:59:22 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 04:59:22 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:59:22.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 04:59:22 np0005540826 python3.9[129954]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:59:22 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:59:22 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd000016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:59:23 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:59:23 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd0c0013d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:59:23 np0005540826 python3.9[130106]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:59:23 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:59:23 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 04:59:23 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:59:23.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 04:59:24 np0005540826 python3.9[130229]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764583162.9641593-1440-215669696908338/.source.json _original_basename=.o2rw92i1 follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:59:24 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:59:24 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd20004060 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:59:24 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:59:24 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  1 04:59:24 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:59:24 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:59:24 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:59:24.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:59:24 np0005540826 python3.9[130382]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:59:24 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:59:24 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdcf8003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:59:25 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:59:25 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:59:25 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd000016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:59:25 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:59:25 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 04:59:25 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:59:25.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 04:59:26 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:59:26 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd0c0013d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:59:26 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:59:26 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:59:26 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:59:26.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:59:26 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:59:26 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd20004060 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:59:27 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:59:27 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd20004060 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:59:27 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:59:27 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  1 04:59:27 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:59:27 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 04:59:27 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:59:27 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 04:59:27 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:59:27 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:59:27 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:59:27.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:59:27 np0005540826 python3.9[130810]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Dec  1 04:59:28 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:59:28 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd00002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:59:28 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:59:28 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:59:28 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:59:28.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:59:28 np0005540826 python3.9[130963]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec  1 04:59:28 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:59:28 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd0c0013d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:59:29 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:59:29 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdcf8004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:59:29 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:59:29 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 04:59:29 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:59:29.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 04:59:29 np0005540826 python3.9[131115]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Dec  1 04:59:30 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:59:30 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd20004060 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:59:30 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:59:30 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:59:30 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  1 04:59:30 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:59:30 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:59:30 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:59:30.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:59:30 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:59:30 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd00002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:59:31 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:59:31 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd0c0038e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:59:31 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:59:31 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 04:59:31 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:59:31.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 04:59:31 np0005540826 python3[131295]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Dec  1 04:59:32 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:59:32 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdcf8004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:59:32 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:59:32 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:59:32 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:59:32.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:59:32 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:59:32 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd20004060 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:59:33 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:59:33 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd00002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:59:33 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:59:33 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:59:33 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:59:33.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:59:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:59:34 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd0c0038e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:59:34 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:59:34 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 04:59:34 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:59:34.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 04:59:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:59:34 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdcf8004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:59:35 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:59:35 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:59:35 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd20004060 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:59:35 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:59:35 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:59:35 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:59:35.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:59:36 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis[85207]: [WARNING] 334/095936 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 1ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  1 04:59:36 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:59:36 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd00003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:59:36 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:59:36 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:59:36 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:59:36.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:59:36 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:59:36 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd0c004200 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:59:37 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:59:37 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdcf8004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:59:37 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:59:37 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 04:59:37 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:59:37.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 04:59:37 np0005540826 podman[131307]: 2025-12-01 09:59:37.726987458 +0000 UTC m=+5.794929789 image pull 3a37a52861b2e44ebd2a63ca2589a7c9d8e4119e5feace9d19c6312ed9b8421c quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Dec  1 04:59:37 np0005540826 podman[131425]: 2025-12-01 09:59:37.926005462 +0000 UTC m=+0.080728383 container create 6b06bbb7dde72e1297fe084ecc5611549b12415c8a2a7d771b9728c6924dbf55 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, managed_by=edpm_ansible)
Dec  1 04:59:37 np0005540826 podman[131425]: 2025-12-01 09:59:37.889615932 +0000 UTC m=+0.044338953 image pull 3a37a52861b2e44ebd2a63ca2589a7c9d8e4119e5feace9d19c6312ed9b8421c quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Dec  1 04:59:37 np0005540826 python3[131295]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Dec  1 04:59:38 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:59:38 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd20004060 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:59:38 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:59:38 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:59:38 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:59:38.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:59:38 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:59:38 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd00003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:59:38 np0005540826 python3.9[131642]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:59:39 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:59:39 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd0c004200 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:59:39 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:59:39 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 04:59:39 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:59:39.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 04:59:39 np0005540826 python3.9[131796]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:59:40 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:59:40 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdcf8004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:59:40 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:59:40 np0005540826 python3.9[131872]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:59:40 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:59:40 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 04:59:40 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:59:40.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 04:59:40 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:59:40 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd20004060 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:59:40 np0005540826 python3.9[132024]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764583180.3285246-1704-180729206918904/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:59:41 np0005540826 kernel: ganesha.nfsd[120338]: segfault at 50 ip 00007fddddd5e32e sp 00007fdd9b7fd210 error 4 in libntirpc.so.5.8[7fddddd43000+2c000] likely on CPU 3 (core 0, socket 3)
Dec  1 04:59:41 np0005540826 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Dec  1 04:59:41 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[117602]: 01/12/2025 09:59:41 : epoch 692d66a7 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdd20004060 fd 38 proxy ignored for local
Dec  1 04:59:41 np0005540826 systemd[1]: Started Process Core Dump (PID 132101/UID 0).
Dec  1 04:59:41 np0005540826 python3.9[132100]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec  1 04:59:41 np0005540826 systemd[1]: Reloading.
Dec  1 04:59:41 np0005540826 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:59:41 np0005540826 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:59:41 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:59:41 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:59:41 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:59:41.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:59:42 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:59:42 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:59:42 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:59:42.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:59:43 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:59:43 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:59:43 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:59:43.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:59:43 np0005540826 python3.9[132216]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 04:59:43 np0005540826 systemd[1]: Reloading.
Dec  1 04:59:43 np0005540826 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:59:43 np0005540826 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:59:44 np0005540826 systemd-coredump[132102]: Process 117606 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 45:#012#0  0x00007fddddd5e32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Dec  1 04:59:44 np0005540826 podman[132259]: 2025-12-01 09:59:44.312098742 +0000 UTC m=+0.036407541 container died 7afbc568252b123e1174f83599752aae8d4ff49bac5b1eacc9effd976fa2de40 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, ceph=True, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default)
Dec  1 04:59:44 np0005540826 systemd[1]: systemd-coredump@4-132101-0.service: Deactivated successfully.
Dec  1 04:59:44 np0005540826 systemd[1]: systemd-coredump@4-132101-0.service: Consumed 2.673s CPU time.
Dec  1 04:59:44 np0005540826 systemd[1]: var-lib-containers-storage-overlay-8dfe7b1104917abc52c46fe9bf1a5a9acf3f92565bc1c45a61f31c00e4d04ddf-merged.mount: Deactivated successfully.
Dec  1 04:59:44 np0005540826 podman[132259]: 2025-12-01 09:59:44.367439376 +0000 UTC m=+0.091748185 container remove 7afbc568252b123e1174f83599752aae8d4ff49bac5b1eacc9effd976fa2de40 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, CEPH_REF=squid, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True)
Dec  1 04:59:44 np0005540826 systemd[1]: Starting ovn_controller container...
Dec  1 04:59:44 np0005540826 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.0.0.compute-1.osfnzc.service: Main process exited, code=exited, status=139/n/a
Dec  1 04:59:44 np0005540826 systemd[1]: Started libcrun container.
Dec  1 04:59:44 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3c08d424fb1e381a1883323d07235020d5d1d9bf844b0084e503841b7a326745/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Dec  1 04:59:44 np0005540826 systemd[1]: Started /usr/bin/podman healthcheck run 6b06bbb7dde72e1297fe084ecc5611549b12415c8a2a7d771b9728c6924dbf55.
Dec  1 04:59:44 np0005540826 podman[132281]: 2025-12-01 09:59:44.531873287 +0000 UTC m=+0.139598627 container init 6b06bbb7dde72e1297fe084ecc5611549b12415c8a2a7d771b9728c6924dbf55 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Dec  1 04:59:44 np0005540826 ovn_controller[132309]: + sudo -E kolla_set_configs
Dec  1 04:59:44 np0005540826 podman[132281]: 2025-12-01 09:59:44.579277468 +0000 UTC m=+0.187002808 container start 6b06bbb7dde72e1297fe084ecc5611549b12415c8a2a7d771b9728c6924dbf55 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller)
Dec  1 04:59:44 np0005540826 edpm-start-podman-container[132281]: ovn_controller
Dec  1 04:59:44 np0005540826 systemd[1]: Created slice User Slice of UID 0.
Dec  1 04:59:44 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:59:44 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 04:59:44 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:59:44.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 04:59:44 np0005540826 systemd[1]: Starting User Runtime Directory /run/user/0...
Dec  1 04:59:44 np0005540826 systemd[1]: Finished User Runtime Directory /run/user/0.
Dec  1 04:59:44 np0005540826 systemd[1]: Starting User Manager for UID 0...
Dec  1 04:59:44 np0005540826 edpm-start-podman-container[132279]: Creating additional drop-in dependency for "ovn_controller" (6b06bbb7dde72e1297fe084ecc5611549b12415c8a2a7d771b9728c6924dbf55)
Dec  1 04:59:44 np0005540826 podman[132316]: 2025-12-01 09:59:44.67019661 +0000 UTC m=+0.076523085 container health_status 6b06bbb7dde72e1297fe084ecc5611549b12415c8a2a7d771b9728c6924dbf55 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, container_name=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  1 04:59:44 np0005540826 systemd[1]: 6b06bbb7dde72e1297fe084ecc5611549b12415c8a2a7d771b9728c6924dbf55-5f19908073bcc8f1.service: Main process exited, code=exited, status=1/FAILURE
Dec  1 04:59:44 np0005540826 systemd[1]: 6b06bbb7dde72e1297fe084ecc5611549b12415c8a2a7d771b9728c6924dbf55-5f19908073bcc8f1.service: Failed with result 'exit-code'.
Dec  1 04:59:44 np0005540826 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.0.0.compute-1.osfnzc.service: Failed with result 'exit-code'.
Dec  1 04:59:44 np0005540826 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.0.0.compute-1.osfnzc.service: Consumed 2.088s CPU time.
Dec  1 04:59:44 np0005540826 systemd[1]: Reloading.
Dec  1 04:59:44 np0005540826 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:59:44 np0005540826 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:59:44 np0005540826 systemd[132351]: Queued start job for default target Main User Target.
Dec  1 04:59:44 np0005540826 systemd[132351]: Created slice User Application Slice.
Dec  1 04:59:44 np0005540826 systemd[132351]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Dec  1 04:59:44 np0005540826 systemd[132351]: Started Daily Cleanup of User's Temporary Directories.
Dec  1 04:59:44 np0005540826 systemd[132351]: Reached target Paths.
Dec  1 04:59:44 np0005540826 systemd[132351]: Reached target Timers.
Dec  1 04:59:44 np0005540826 systemd[132351]: Starting D-Bus User Message Bus Socket...
Dec  1 04:59:44 np0005540826 systemd[132351]: Starting Create User's Volatile Files and Directories...
Dec  1 04:59:44 np0005540826 systemd[132351]: Listening on D-Bus User Message Bus Socket.
Dec  1 04:59:44 np0005540826 systemd[132351]: Finished Create User's Volatile Files and Directories.
Dec  1 04:59:44 np0005540826 systemd[132351]: Reached target Sockets.
Dec  1 04:59:44 np0005540826 systemd[132351]: Reached target Basic System.
Dec  1 04:59:44 np0005540826 systemd[132351]: Reached target Main User Target.
Dec  1 04:59:44 np0005540826 systemd[132351]: Startup finished in 163ms.
Dec  1 04:59:44 np0005540826 systemd[1]: Started User Manager for UID 0.
Dec  1 04:59:44 np0005540826 systemd[1]: Started ovn_controller container.
Dec  1 04:59:44 np0005540826 systemd[1]: Started Session c1 of User root.
Dec  1 04:59:45 np0005540826 ovn_controller[132309]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec  1 04:59:45 np0005540826 ovn_controller[132309]: INFO:__main__:Validating config file
Dec  1 04:59:45 np0005540826 ovn_controller[132309]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec  1 04:59:45 np0005540826 ovn_controller[132309]: INFO:__main__:Writing out command to execute
Dec  1 04:59:45 np0005540826 systemd[1]: session-c1.scope: Deactivated successfully.
Dec  1 04:59:45 np0005540826 ovn_controller[132309]: ++ cat /run_command
Dec  1 04:59:45 np0005540826 ovn_controller[132309]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Dec  1 04:59:45 np0005540826 ovn_controller[132309]: + ARGS=
Dec  1 04:59:45 np0005540826 ovn_controller[132309]: + sudo kolla_copy_cacerts
Dec  1 04:59:45 np0005540826 systemd[1]: Started Session c2 of User root.
Dec  1 04:59:45 np0005540826 systemd[1]: session-c2.scope: Deactivated successfully.
Dec  1 04:59:45 np0005540826 ovn_controller[132309]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Dec  1 04:59:45 np0005540826 ovn_controller[132309]: + [[ ! -n '' ]]
Dec  1 04:59:45 np0005540826 ovn_controller[132309]: + . kolla_extend_start
Dec  1 04:59:45 np0005540826 ovn_controller[132309]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Dec  1 04:59:45 np0005540826 ovn_controller[132309]: + umask 0022
Dec  1 04:59:45 np0005540826 ovn_controller[132309]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Dec  1 04:59:45 np0005540826 ovn_controller[132309]: 2025-12-01T09:59:45Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Dec  1 04:59:45 np0005540826 ovn_controller[132309]: 2025-12-01T09:59:45Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Dec  1 04:59:45 np0005540826 ovn_controller[132309]: 2025-12-01T09:59:45Z|00003|main|INFO|OVN internal version is : [24.03.8-20.33.0-76.8]
Dec  1 04:59:45 np0005540826 ovn_controller[132309]: 2025-12-01T09:59:45Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Dec  1 04:59:45 np0005540826 ovn_controller[132309]: 2025-12-01T09:59:45Z|00005|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Dec  1 04:59:45 np0005540826 ovn_controller[132309]: 2025-12-01T09:59:45Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Dec  1 04:59:45 np0005540826 NetworkManager[48989]: <info>  [1764583185.1827] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/16)
Dec  1 04:59:45 np0005540826 NetworkManager[48989]: <info>  [1764583185.1839] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  1 04:59:45 np0005540826 NetworkManager[48989]: <info>  [1764583185.1853] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Dec  1 04:59:45 np0005540826 NetworkManager[48989]: <info>  [1764583185.1858] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/18)
Dec  1 04:59:45 np0005540826 NetworkManager[48989]: <info>  [1764583185.1862] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Dec  1 04:59:45 np0005540826 ovn_controller[132309]: 2025-12-01T09:59:45Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Dec  1 04:59:45 np0005540826 kernel: br-int: entered promiscuous mode
Dec  1 04:59:45 np0005540826 ovn_controller[132309]: 2025-12-01T09:59:45Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Dec  1 04:59:45 np0005540826 ovn_controller[132309]: 2025-12-01T09:59:45Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec  1 04:59:45 np0005540826 ovn_controller[132309]: 2025-12-01T09:59:45Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported
Dec  1 04:59:45 np0005540826 ovn_controller[132309]: 2025-12-01T09:59:45Z|00011|features|INFO|OVS Feature: ct_flush, state: supported
Dec  1 04:59:45 np0005540826 ovn_controller[132309]: 2025-12-01T09:59:45Z|00012|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Dec  1 04:59:45 np0005540826 ovn_controller[132309]: 2025-12-01T09:59:45Z|00013|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Dec  1 04:59:45 np0005540826 ovn_controller[132309]: 2025-12-01T09:59:45Z|00014|main|INFO|OVS feature set changed, force recompute.
Dec  1 04:59:45 np0005540826 ovn_controller[132309]: 2025-12-01T09:59:45Z|00015|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Dec  1 04:59:45 np0005540826 ovn_controller[132309]: 2025-12-01T09:59:45Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec  1 04:59:45 np0005540826 ovn_controller[132309]: 2025-12-01T09:59:45Z|00017|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Dec  1 04:59:45 np0005540826 ovn_controller[132309]: 2025-12-01T09:59:45Z|00018|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec  1 04:59:45 np0005540826 ovn_controller[132309]: 2025-12-01T09:59:45Z|00019|main|INFO|OVS feature set changed, force recompute.
Dec  1 04:59:45 np0005540826 ovn_controller[132309]: 2025-12-01T09:59:45Z|00020|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec  1 04:59:45 np0005540826 ovn_controller[132309]: 2025-12-01T09:59:45Z|00021|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Dec  1 04:59:45 np0005540826 ovn_controller[132309]: 2025-12-01T09:59:45Z|00022|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Dec  1 04:59:45 np0005540826 ovn_controller[132309]: 2025-12-01T09:59:45Z|00023|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Dec  1 04:59:45 np0005540826 ovn_controller[132309]: 2025-12-01T09:59:45Z|00024|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Dec  1 04:59:45 np0005540826 ovn_controller[132309]: 2025-12-01T09:59:45Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Dec  1 04:59:45 np0005540826 ovn_controller[132309]: 2025-12-01T09:59:45Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec  1 04:59:45 np0005540826 ovn_controller[132309]: 2025-12-01T09:59:45Z|00001|statctrl(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Dec  1 04:59:45 np0005540826 ovn_controller[132309]: 2025-12-01T09:59:45Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec  1 04:59:45 np0005540826 ovn_controller[132309]: 2025-12-01T09:59:45Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec  1 04:59:45 np0005540826 ovn_controller[132309]: 2025-12-01T09:59:45Z|00003|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec  1 04:59:45 np0005540826 NetworkManager[48989]: <info>  [1764583185.2067] manager: (ovn-4d9738-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/19)
Dec  1 04:59:45 np0005540826 NetworkManager[48989]: <info>  [1764583185.2076] manager: (ovn-968d9d-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/20)
Dec  1 04:59:45 np0005540826 NetworkManager[48989]: <info>  [1764583185.2083] manager: (ovn-9a0c85-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/21)
Dec  1 04:59:45 np0005540826 systemd-udevd[132453]: Network interface NamePolicy= disabled on kernel command line.
Dec  1 04:59:45 np0005540826 kernel: genev_sys_6081: entered promiscuous mode
Dec  1 04:59:45 np0005540826 NetworkManager[48989]: <info>  [1764583185.2730] device (genev_sys_6081): carrier: link connected
Dec  1 04:59:45 np0005540826 NetworkManager[48989]: <info>  [1764583185.2735] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/22)
Dec  1 04:59:45 np0005540826 systemd-udevd[132459]: Network interface NamePolicy= disabled on kernel command line.
Dec  1 04:59:45 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:59:45 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 04:59:45 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:59:45.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 04:59:45 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:59:46 np0005540826 python3.9[132592]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:59:46 np0005540826 ovs-vsctl[132594]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Dec  1 04:59:46 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:59:46 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:59:46 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:59:46.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:59:47 np0005540826 python3.9[132746]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:59:47 np0005540826 ovs-vsctl[132748]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Dec  1 04:59:47 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:59:47 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 04:59:47 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:59:47.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 04:59:48 np0005540826 python3.9[132902]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:59:48 np0005540826 ovs-vsctl[132903]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Dec  1 04:59:48 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:59:48 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:59:48 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:59:48.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:59:48 np0005540826 systemd[1]: session-49.scope: Deactivated successfully.
Dec  1 04:59:48 np0005540826 systemd-logind[787]: Session 49 logged out. Waiting for processes to exit.
Dec  1 04:59:48 np0005540826 systemd[1]: session-49.scope: Consumed 1min 248ms CPU time.
Dec  1 04:59:48 np0005540826 systemd-logind[787]: Removed session 49.
Dec  1 04:59:49 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis[85207]: [WARNING] 334/095949 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  1 04:59:49 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:59:49 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 04:59:49 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:59:49.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 04:59:50 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis[85207]: [WARNING] 334/095950 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  1 04:59:50 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:59:50 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:59:50 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:59:50.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:59:50 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:59:51 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:59:51 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:59:51 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:59:51.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:59:52 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:59:52 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 04:59:52 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:59:52.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 04:59:53 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:59:53 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 04:59:53 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:59:53.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 04:59:54 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:59:54 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 04:59:54 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:59:54.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 04:59:54 np0005540826 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.0.0.compute-1.osfnzc.service: Scheduled restart job, restart counter is at 5.
Dec  1 04:59:54 np0005540826 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.osfnzc for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 04:59:54 np0005540826 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.0.0.compute-1.osfnzc.service: Consumed 2.088s CPU time.
Dec  1 04:59:54 np0005540826 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.osfnzc for 365f19c2-81e5-5edd-b6b4-280555214d3a...
Dec  1 04:59:54 np0005540826 systemd-logind[787]: New session 51 of user zuul.
Dec  1 04:59:54 np0005540826 systemd[1]: Started Session 51 of User zuul.
Dec  1 04:59:55 np0005540826 podman[133028]: 2025-12-01 09:59:55.111708793 +0000 UTC m=+0.047474084 container create 1192288e0cc1942c5cb4c668320781d307b355624ff2f41330c5b4eb21512de6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:59:55 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ab5d4d864cc50e5c673533fcabc44108e5b00d0000a14e3fd9db3b9dc258a96a/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Dec  1 04:59:55 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ab5d4d864cc50e5c673533fcabc44108e5b00d0000a14e3fd9db3b9dc258a96a/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 04:59:55 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ab5d4d864cc50e5c673533fcabc44108e5b00d0000a14e3fd9db3b9dc258a96a/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:59:55 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ab5d4d864cc50e5c673533fcabc44108e5b00d0000a14e3fd9db3b9dc258a96a/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.osfnzc-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 04:59:55 np0005540826 podman[133028]: 2025-12-01 09:59:55.091532107 +0000 UTC m=+0.027297428 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 04:59:55 np0005540826 systemd[1]: Stopping User Manager for UID 0...
Dec  1 04:59:55 np0005540826 systemd[132351]: Activating special unit Exit the Session...
Dec  1 04:59:55 np0005540826 systemd[132351]: Stopped target Main User Target.
Dec  1 04:59:55 np0005540826 systemd[132351]: Stopped target Basic System.
Dec  1 04:59:55 np0005540826 systemd[132351]: Stopped target Paths.
Dec  1 04:59:55 np0005540826 systemd[132351]: Stopped target Sockets.
Dec  1 04:59:55 np0005540826 systemd[132351]: Stopped target Timers.
Dec  1 04:59:55 np0005540826 systemd[132351]: Stopped Daily Cleanup of User's Temporary Directories.
Dec  1 04:59:55 np0005540826 systemd[132351]: Closed D-Bus User Message Bus Socket.
Dec  1 04:59:55 np0005540826 systemd[132351]: Stopped Create User's Volatile Files and Directories.
Dec  1 04:59:55 np0005540826 systemd[132351]: Removed slice User Application Slice.
Dec  1 04:59:55 np0005540826 systemd[132351]: Reached target Shutdown.
Dec  1 04:59:55 np0005540826 systemd[132351]: Finished Exit the Session.
Dec  1 04:59:55 np0005540826 systemd[132351]: Reached target Exit the Session.
Dec  1 04:59:55 np0005540826 podman[133028]: 2025-12-01 09:59:55.408878354 +0000 UTC m=+0.344643665 container init 1192288e0cc1942c5cb4c668320781d307b355624ff2f41330c5b4eb21512de6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:59:55 np0005540826 systemd[1]: user@0.service: Deactivated successfully.
Dec  1 04:59:55 np0005540826 systemd[1]: Stopped User Manager for UID 0.
Dec  1 04:59:55 np0005540826 podman[133028]: 2025-12-01 09:59:55.414117188 +0000 UTC m=+0.349882479 container start 1192288e0cc1942c5cb4c668320781d307b355624ff2f41330c5b4eb21512de6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:59:55 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 09:59:55 : epoch 692d671b : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Dec  1 04:59:55 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 09:59:55 : epoch 692d671b : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Dec  1 04:59:55 np0005540826 bash[133028]: 1192288e0cc1942c5cb4c668320781d307b355624ff2f41330c5b4eb21512de6
Dec  1 04:59:55 np0005540826 systemd[1]: Stopping User Runtime Directory /run/user/0...
Dec  1 04:59:55 np0005540826 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.osfnzc for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 04:59:55 np0005540826 systemd[1]: run-user-0.mount: Deactivated successfully.
Dec  1 04:59:55 np0005540826 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Dec  1 04:59:55 np0005540826 systemd[1]: Stopped User Runtime Directory /run/user/0.
Dec  1 04:59:55 np0005540826 systemd[1]: Removed slice User Slice of UID 0.
Dec  1 04:59:55 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 09:59:55 : epoch 692d671b : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Dec  1 04:59:55 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 09:59:55 : epoch 692d671b : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Dec  1 04:59:55 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 09:59:55 : epoch 692d671b : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Dec  1 04:59:55 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 09:59:55 : epoch 692d671b : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Dec  1 04:59:55 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 09:59:55 : epoch 692d671b : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Dec  1 04:59:55 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 09:59:55 : epoch 692d671b : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  1 04:59:55 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:59:55 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:59:55 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:59:55.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:59:55 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:59:55 np0005540826 python3.9[133185]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:59:56 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:59:56 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:59:56 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:59:56.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:59:57 np0005540826 python3.9[133342]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:59:57 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:59:57 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 04:59:57 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:59:57.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 04:59:58 np0005540826 python3.9[133494]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:59:58 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:59:58 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 04:59:58 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:09:59:58.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 04:59:58 np0005540826 python3.9[133672]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:59:59 np0005540826 python3.9[133824]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:59:59 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 04:59:59 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:59:59 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:09:59:59.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:00:00 np0005540826 python3.9[133976]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 05:00:00 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:00:00 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:00:00 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:00:00.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:00:00 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:00:01 np0005540826 ceph-mon[80026]: Health detail: HEALTH_WARN 1 OSD(s) experiencing slow operations in BlueStore; 1 failed cephadm daemon(s)
Dec  1 05:00:01 np0005540826 ceph-mon[80026]: [WRN] BLUESTORE_SLOW_OP_ALERT: 1 OSD(s) experiencing slow operations in BlueStore
Dec  1 05:00:01 np0005540826 ceph-mon[80026]:     osd.2 observed slow operation indications in BlueStore
Dec  1 05:00:01 np0005540826 ceph-mon[80026]: [WRN] CEPHADM_FAILED_DAEMON: 1 failed cephadm daemon(s)
Dec  1 05:00:01 np0005540826 ceph-mon[80026]:    daemon nfs.cephfs.2.0.compute-0.pytvsu on compute-0 is in unknown state
Dec  1 05:00:01 np0005540826 python3.9[134127]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 05:00:01 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:01 : epoch 692d671b : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  1 05:00:01 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:01 : epoch 692d671b : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 05:00:01 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:00:01 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:00:01 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:00:01.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:00:02 np0005540826 python3.9[134279]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Dec  1 05:00:02 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:00:02 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:00:02 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:00:02.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:00:03 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:00:03 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:00:03 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:00:03.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:00:03 np0005540826 python3.9[134431]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:00:04 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:04 : epoch 692d671b : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  1 05:00:04 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:04 : epoch 692d671b : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  1 05:00:04 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:04 : epoch 692d671b : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  1 05:00:04 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:04 : epoch 692d671b : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 05:00:04 np0005540826 python3.9[134553]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764583203.2422118-219-204827376077432/.source follow=False _original_basename=haproxy.j2 checksum=95c62e64c8f82dd9393a560d1b052dc98d38f810 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  1 05:00:04 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:00:04 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:00:04 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:00:04.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:00:05 np0005540826 python3.9[134772]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:00:05 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:00:05 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:00:05 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:00:05.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:00:05 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:00:05 np0005540826 python3.9[134905]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764583204.8119302-264-17367165112571/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  1 05:00:05 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 05:00:05 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:00:05 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:00:05 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 05:00:06 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:00:06 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 05:00:06 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:00:06.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 05:00:06 np0005540826 python3.9[135058]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  1 05:00:07 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:07 : epoch 692d671b : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  1 05:00:07 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:07 : epoch 692d671b : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Dec  1 05:00:07 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:07 : epoch 692d671b : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Dec  1 05:00:07 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:07 : epoch 692d671b : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Dec  1 05:00:07 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:07 : epoch 692d671b : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Dec  1 05:00:07 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:07 : epoch 692d671b : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Dec  1 05:00:07 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:07 : epoch 692d671b : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Dec  1 05:00:07 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:07 : epoch 692d671b : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:00:07 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:07 : epoch 692d671b : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:00:07 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:07 : epoch 692d671b : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:00:07 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:07 : epoch 692d671b : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Dec  1 05:00:07 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:07 : epoch 692d671b : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:00:07 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:07 : epoch 692d671b : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Dec  1 05:00:07 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:07 : epoch 692d671b : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Dec  1 05:00:07 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:07 : epoch 692d671b : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Dec  1 05:00:07 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:07 : epoch 692d671b : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Dec  1 05:00:07 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:07 : epoch 692d671b : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Dec  1 05:00:07 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:07 : epoch 692d671b : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Dec  1 05:00:07 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:07 : epoch 692d671b : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Dec  1 05:00:07 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:07 : epoch 692d671b : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Dec  1 05:00:07 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:07 : epoch 692d671b : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Dec  1 05:00:07 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:07 : epoch 692d671b : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Dec  1 05:00:07 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:07 : epoch 692d671b : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Dec  1 05:00:07 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:07 : epoch 692d671b : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Dec  1 05:00:07 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:07 : epoch 692d671b : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  1 05:00:07 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:07 : epoch 692d671b : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Dec  1 05:00:07 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:07 : epoch 692d671b : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  1 05:00:07 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:00:07 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:00:07 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:00:07.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:00:07 np0005540826 python3.9[135142]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  1 05:00:08 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:08 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6348000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:08 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:00:08 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 05:00:08 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:00:08.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 05:00:08 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:08 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f633c0014d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:09 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:09 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6324000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:09 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:00:09 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:00:09 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:00:09.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:00:10 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:10 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6344001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:10 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:00:10 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:00:10 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:00:10.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:00:10 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:00:10 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:10 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f633c0014d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:11 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis[85207]: [WARNING] 334/100011 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  1 05:00:11 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:11 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f632c000fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:11 np0005540826 python3.9[135336]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec  1 05:00:11 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:00:11 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:00:11 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:00:11.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:00:11 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:00:11 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:00:12 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis[85207]: [WARNING] 334/100012 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  1 05:00:12 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:12 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f63240016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:12 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:00:12 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:00:12 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:00:12.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:00:12 np0005540826 python3.9[135490]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:00:12 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:12 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f63440023e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:13 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:13 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f633c0025c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:13 np0005540826 python3.9[135611]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764583212.3325543-375-208395713249804/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  1 05:00:13 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:00:13 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 05:00:13 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:00:13.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 05:00:14 np0005540826 python3.9[135761]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:00:14 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:14 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f632c001ae0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:14 np0005540826 python3.9[135883]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764583213.5487454-375-86745085018216/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  1 05:00:14 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:00:14 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:00:14 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:00:14.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:00:14 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:14 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f63240016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:15 np0005540826 ovn_controller[132309]: 2025-12-01T10:00:15Z|00025|memory|INFO|16384 kB peak resident set size after 29.8 seconds
Dec  1 05:00:15 np0005540826 ovn_controller[132309]: 2025-12-01T10:00:15Z|00026|memory|INFO|idl-cells-OVN_Southbound:273 idl-cells-Open_vSwitch:642 ofctrl_desired_flow_usage-KB:7 ofctrl_installed_flow_usage-KB:5 ofctrl_sb_flow_ref_usage-KB:3
Dec  1 05:00:15 np0005540826 podman[135908]: 2025-12-01 10:00:15.040971672 +0000 UTC m=+0.113693107 container health_status 6b06bbb7dde72e1297fe084ecc5611549b12415c8a2a7d771b9728c6924dbf55 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller)
Dec  1 05:00:15 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:15 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f63440023e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:15 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:00:15 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000028s ======
Dec  1 05:00:15 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:00:15.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Dec  1 05:00:15 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:00:16 np0005540826 python3.9[136060]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:00:16 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:16 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f633c0025c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:16 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:00:16 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 05:00:16 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:00:16.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 05:00:16 np0005540826 python3.9[136182]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764583215.6828148-507-13367205087176/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  1 05:00:16 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:16 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f632c001ae0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:17 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:17 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f63240016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:17 np0005540826 python3.9[136332]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:00:17 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:00:17 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 05:00:17 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:00:17.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 05:00:17 np0005540826 python3.9[136453]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764583216.8735206-507-52817680166214/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=a14d6b38898a379cd37fc0bf365d17f10859446f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  1 05:00:18 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:18 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f633c0025c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:18 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:00:18 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 05:00:18 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:00:18.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 05:00:18 np0005540826 python3.9[136629]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 05:00:18 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:18 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f63440023e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:19 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:19 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f63440023e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:19 np0005540826 python3.9[136783]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  1 05:00:19 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:00:19 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:00:19 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:00:19.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:00:20 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:20 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6324002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:20 np0005540826 python3.9[136936]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:00:20 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:00:20 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:00:20 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:00:20.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:00:20 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:00:20 np0005540826 python3.9[137014]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 05:00:20 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:20 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f633c0025c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:21 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:21 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f632c002b80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:21 np0005540826 python3.9[137166]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:00:21 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:00:21 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:00:21 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:00:21.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:00:21 np0005540826 python3.9[137244]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 05:00:22 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:22 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6344003960 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:22 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:00:22 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:00:22 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:00:22.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:00:22 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:22 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6344003960 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:23 np0005540826 python3.9[137397]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:00:23 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:23 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f633c0025c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:23 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:00:23 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 05:00:23 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:00:23.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 05:00:23 np0005540826 python3.9[137549]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:00:24 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:24 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f632c002b80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:24 np0005540826 python3.9[137628]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:00:24 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:00:24 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 05:00:24 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:00:24.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 05:00:24 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:24 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6324002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:25 np0005540826 python3.9[137780]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:00:25 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:25 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6344003960 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:25 np0005540826 python3.9[137858]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:00:25 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:00:25 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:00:25 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:00:25.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:00:25 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:00:26 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis[85207]: [WARNING] 334/100026 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  1 05:00:26 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:26 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f633c0025c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:26 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:00:26 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:00:26 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:00:26.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:00:26 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:26 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f632c002b80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:27 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:27 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f632c002b80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:27 np0005540826 python3.9[138011]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 05:00:27 np0005540826 systemd[1]: Reloading.
Dec  1 05:00:27 np0005540826 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 05:00:27 np0005540826 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 05:00:27 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:00:27 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:00:27 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:00:27.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:00:28 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:28 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6344003960 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:28 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:00:28 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:00:28 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:00:28.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:00:28 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:28 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f633c0025c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:29 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:29 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6324003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:29 np0005540826 python3.9[138201]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:00:29 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:00:29 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:00:29 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:00:29.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:00:30 np0005540826 python3.9[138279]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:00:30 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:30 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f632c004070 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:30 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:00:30 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 05:00:30 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:00:30.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 05:00:30 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:00:30 np0005540826 python3.9[138432]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:00:30 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:30 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f632c004070 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:31 np0005540826 python3.9[138510]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:00:31 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:31 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f633c0025c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:31 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:00:31 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:00:31 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:00:31.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:00:32 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:32 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6324003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:32 np0005540826 python3.9[138662]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 05:00:32 np0005540826 systemd[1]: Reloading.
Dec  1 05:00:32 np0005540826 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 05:00:32 np0005540826 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 05:00:32 np0005540826 systemd[1]: Starting Create netns directory...
Dec  1 05:00:32 np0005540826 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec  1 05:00:32 np0005540826 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec  1 05:00:32 np0005540826 systemd[1]: Finished Create netns directory.
Dec  1 05:00:32 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:00:32 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 05:00:32 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:00:32.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 05:00:32 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:32 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f632c004070 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:33 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:33 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6344003960 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:33 np0005540826 python3.9[138858]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 05:00:33 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:00:33 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:00:33 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:00:33.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:00:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:34 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f633c0025c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:34 : epoch 692d671b : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  1 05:00:34 np0005540826 python3.9[139011]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:00:34 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:00:34 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 05:00:34 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:00:34.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 05:00:34 np0005540826 python3.9[139134]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764583233.8753922-960-274982367282441/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec  1 05:00:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:34 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f633c0025c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:35 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:35 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f632c004070 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:35 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:00:35 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:00:35 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:00:35.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:00:35 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:00:35 np0005540826 python3.9[139286]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  1 05:00:36 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:36 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6344003960 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:36 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:00:36 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:00:36 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:00:36.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:00:36 np0005540826 python3.9[139439]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:00:36 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:36 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f633c0025c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:37 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:37 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f633c0025c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:37 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:37 : epoch 692d671b : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  1 05:00:37 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:37 : epoch 692d671b : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 05:00:37 np0005540826 python3.9[139562]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764583236.344359-1035-41069132943918/.source.json _original_basename=.hp0vq4gr follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:00:37 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:37 : epoch 692d671b : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 05:00:37 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:00:37 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 05:00:37 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:00:37.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 05:00:38 np0005540826 python3.9[139714]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:00:38 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:38 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f632c004070 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:38 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:00:38 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:00:38 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:00:38.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:00:38 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:38 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6344003960 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:39 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:39 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f633c0025c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:39 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:00:39 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000028s ======
Dec  1 05:00:39 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:00:39.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Dec  1 05:00:40 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:40 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6324003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:40 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:40 : epoch 692d671b : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  1 05:00:40 np0005540826 python3.9[140170]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Dec  1 05:00:40 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:00:40 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:00:40 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:00:40.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:00:40 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:00:40 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:40 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f633c0025c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:41 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:41 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6318000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:41 np0005540826 python3.9[140322]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec  1 05:00:41 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:00:41 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:00:41 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:00:41.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:00:42 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:42 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6318000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:42 np0005540826 python3.9[140475]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Dec  1 05:00:42 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:00:42 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:00:42 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:00:42.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:00:42 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:42 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6324003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:43 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:43 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f633c0025c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:43 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:00:43 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 05:00:43 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:00:43.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 05:00:44 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:44 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6320000ea0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:44 np0005540826 python3[140656]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Dec  1 05:00:44 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:00:44 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:00:44 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:00:44.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:00:44 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:44 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6320000ea0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:45 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:45 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6324003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:45 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:00:45 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:00:45 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:00:45 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:00:45.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:00:46 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis[85207]: [WARNING] 334/100046 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  1 05:00:46 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:46 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f633c0025c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:46 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:00:46 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:00:46 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:00:46.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:00:46 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:46 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f633c0025c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:47 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f633c0025c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:47 np0005540826 podman[140719]: 2025-12-01 10:00:47.730464588 +0000 UTC m=+1.872848017 container health_status 6b06bbb7dde72e1297fe084ecc5611549b12415c8a2a7d771b9728c6924dbf55 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec  1 05:00:47 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:00:47 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:00:47 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:00:47.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:00:48 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:48 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6314000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:48 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:48 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6318001b40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:49 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:00:49 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:00:49 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:00:49.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:00:49 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:49 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6324003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:49 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:00:49 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 05:00:49 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:00:49.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 05:00:50 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:50 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f633c0025c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:50 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:50 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f63140016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:50 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:00:51 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:00:51 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 05:00:51 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:00:51.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 05:00:51 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:51 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6318001b40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:51 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:00:51 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 05:00:51 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:00:51.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 05:00:52 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:52 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6324003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:52 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:52 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6324003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:53 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:00:53 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:00:53 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:00:53.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:00:53 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:53 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f633c0025c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:53 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:00:53 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:00:53 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:00:53.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:00:54 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:54 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6318002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:54 np0005540826 podman[140669]: 2025-12-01 10:00:54.273870851 +0000 UTC m=+9.812659189 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  1 05:00:54 np0005540826 podman[140822]: 2025-12-01 10:00:54.437406824 +0000 UTC m=+0.060146488 container create 2f6a34a2eb1139886d5df897b4264e2aa17883bd121a8757ac91426f6d44356f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec  1 05:00:54 np0005540826 podman[140822]: 2025-12-01 10:00:54.40511679 +0000 UTC m=+0.027856464 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  1 05:00:54 np0005540826 python3[140656]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host 
--pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  1 05:00:54 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:54 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6314001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:55 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:00:55 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:00:55 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:00:55.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:00:55 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:55 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6324004140 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:55 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:00:55 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:00:55 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:00:55.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:00:55 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:00:56 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:56 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f633c0025c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:56 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:56 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6318002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:57 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:00:57 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:00:57 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:00:57.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:00:57 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:57 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6314001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:57 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:00:57 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 05:00:57 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:00:57.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 05:00:58 np0005540826 python3.9[141011]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 05:00:58 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:58 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6324004140 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:58 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:58 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f633c0025c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:59 np0005540826 python3.9[141194]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:00:59 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:00:59 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 05:00:59 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:00:59.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 05:00:59 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:00:59 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6318002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:59 np0005540826 python3.9[141270]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 05:00:59 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:00:59 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:00:59 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:00:59.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:01:00 np0005540826 python3.9[141421]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764583259.602328-1299-81148561606873/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:01:00 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:00 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6314001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:00 np0005540826 python3.9[141498]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec  1 05:01:00 np0005540826 systemd[1]: Reloading.
Dec  1 05:01:00 np0005540826 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 05:01:00 np0005540826 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 05:01:00 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:00 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6324004140 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:00 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:01:01 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:01:01 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:01:01 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:01:01.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:01:01 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:01 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f633c0025c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:01 np0005540826 python3.9[141623]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 05:01:01 np0005540826 systemd[1]: Reloading.
Dec  1 05:01:01 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:01:01 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 05:01:01 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:01:01.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 05:01:01 np0005540826 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 05:01:01 np0005540826 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 05:01:02 np0005540826 systemd[1]: Starting ovn_metadata_agent container...
Dec  1 05:01:02 np0005540826 systemd[1]: Started libcrun container.
Dec  1 05:01:02 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:02 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6318003fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:02 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/df3fca8ac1ad60c5ef671f7c7812b94f91686e6352156ae6252633e9151e67d7/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Dec  1 05:01:02 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/df3fca8ac1ad60c5ef671f7c7812b94f91686e6352156ae6252633e9151e67d7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  1 05:01:02 np0005540826 systemd[1]: Started /usr/bin/podman healthcheck run 2f6a34a2eb1139886d5df897b4264e2aa17883bd121a8757ac91426f6d44356f.
Dec  1 05:01:02 np0005540826 podman[141664]: 2025-12-01 10:01:02.300036498 +0000 UTC m=+0.157633715 container init 2f6a34a2eb1139886d5df897b4264e2aa17883bd121a8757ac91426f6d44356f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, 
tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec  1 05:01:02 np0005540826 ovn_metadata_agent[141680]: + sudo -E kolla_set_configs
Dec  1 05:01:02 np0005540826 podman[141664]: 2025-12-01 10:01:02.323721128 +0000 UTC m=+0.181318345 container start 2f6a34a2eb1139886d5df897b4264e2aa17883bd121a8757ac91426f6d44356f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 05:01:02 np0005540826 edpm-start-podman-container[141664]: ovn_metadata_agent
Dec  1 05:01:02 np0005540826 ovn_metadata_agent[141680]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec  1 05:01:02 np0005540826 ovn_metadata_agent[141680]: INFO:__main__:Validating config file
Dec  1 05:01:02 np0005540826 ovn_metadata_agent[141680]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec  1 05:01:02 np0005540826 ovn_metadata_agent[141680]: INFO:__main__:Copying service configuration files
Dec  1 05:01:02 np0005540826 ovn_metadata_agent[141680]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Dec  1 05:01:02 np0005540826 ovn_metadata_agent[141680]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Dec  1 05:01:02 np0005540826 ovn_metadata_agent[141680]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Dec  1 05:01:02 np0005540826 ovn_metadata_agent[141680]: INFO:__main__:Writing out command to execute
Dec  1 05:01:02 np0005540826 ovn_metadata_agent[141680]: INFO:__main__:Setting permission for /var/lib/neutron
Dec  1 05:01:02 np0005540826 ovn_metadata_agent[141680]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Dec  1 05:01:02 np0005540826 ovn_metadata_agent[141680]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Dec  1 05:01:02 np0005540826 ovn_metadata_agent[141680]: INFO:__main__:Setting permission for /var/lib/neutron/external
Dec  1 05:01:02 np0005540826 ovn_metadata_agent[141680]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Dec  1 05:01:02 np0005540826 ovn_metadata_agent[141680]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Dec  1 05:01:02 np0005540826 ovn_metadata_agent[141680]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Dec  1 05:01:02 np0005540826 edpm-start-podman-container[141663]: Creating additional drop-in dependency for "ovn_metadata_agent" (2f6a34a2eb1139886d5df897b4264e2aa17883bd121a8757ac91426f6d44356f)
Dec  1 05:01:02 np0005540826 ovn_metadata_agent[141680]: ++ cat /run_command
Dec  1 05:01:02 np0005540826 podman[141687]: 2025-12-01 10:01:02.40069844 +0000 UTC m=+0.062171202 container health_status 2f6a34a2eb1139886d5df897b4264e2aa17883bd121a8757ac91426f6d44356f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Dec  1 05:01:02 np0005540826 ovn_metadata_agent[141680]: + CMD=neutron-ovn-metadata-agent
Dec  1 05:01:02 np0005540826 ovn_metadata_agent[141680]: + ARGS=
Dec  1 05:01:02 np0005540826 ovn_metadata_agent[141680]: + sudo kolla_copy_cacerts
Dec  1 05:01:02 np0005540826 systemd[1]: Reloading.
Dec  1 05:01:02 np0005540826 ovn_metadata_agent[141680]: + [[ ! -n '' ]]
Dec  1 05:01:02 np0005540826 ovn_metadata_agent[141680]: + . kolla_extend_start
Dec  1 05:01:02 np0005540826 ovn_metadata_agent[141680]: Running command: 'neutron-ovn-metadata-agent'
Dec  1 05:01:02 np0005540826 ovn_metadata_agent[141680]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Dec  1 05:01:02 np0005540826 ovn_metadata_agent[141680]: + umask 0022
Dec  1 05:01:02 np0005540826 ovn_metadata_agent[141680]: + exec neutron-ovn-metadata-agent
Dec  1 05:01:02 np0005540826 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 05:01:02 np0005540826 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 05:01:02 np0005540826 systemd[1]: Started ovn_metadata_agent container.
Dec  1 05:01:02 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:02 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f63140032f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:03 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:01:03 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:01:03 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:01:03.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:01:03 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:03 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6324004140 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:03 np0005540826 systemd[1]: session-51.scope: Deactivated successfully.
Dec  1 05:01:03 np0005540826 systemd[1]: session-51.scope: Consumed 57.348s CPU time.
Dec  1 05:01:03 np0005540826 systemd-logind[787]: Session 51 logged out. Waiting for processes to exit.
Dec  1 05:01:03 np0005540826 systemd-logind[787]: Removed session 51.
Dec  1 05:01:03 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:01:03 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:01:03 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:01:03.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:01:04 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:04 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f633c0025c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.489 141685 INFO neutron.common.config [-] Logging enabled!#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.489 141685 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev43#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.489 141685 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.490 141685 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.490 141685 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.490 141685 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.490 141685 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.490 141685 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.490 141685 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.490 141685 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.491 141685 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.491 141685 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.491 141685 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.491 141685 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.491 141685 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.491 141685 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.491 141685 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.491 141685 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.492 141685 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.492 141685 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.492 141685 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.492 141685 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.492 141685 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.492 141685 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.492 141685 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.492 141685 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.492 141685 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.493 141685 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.493 141685 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.493 141685 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.493 141685 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.493 141685 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.493 141685 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.493 141685 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.493 141685 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.494 141685 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.494 141685 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.494 141685 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.494 141685 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.494 141685 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.494 141685 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.494 141685 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.494 141685 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.495 141685 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.495 141685 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.495 141685 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.495 141685 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.495 141685 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.495 141685 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.495 141685 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.495 141685 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.495 141685 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.496 141685 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.496 141685 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.496 141685 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.496 141685 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.496 141685 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.496 141685 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.496 141685 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.496 141685 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.496 141685 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.497 141685 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.497 141685 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.497 141685 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.497 141685 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.497 141685 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.497 141685 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.498 141685 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.498 141685 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.498 141685 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.498 141685 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.498 141685 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.498 141685 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.499 141685 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.499 141685 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.499 141685 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.499 141685 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.499 141685 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.499 141685 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.500 141685 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.500 141685 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.500 141685 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.500 141685 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.500 141685 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.500 141685 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.500 141685 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.500 141685 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.500 141685 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.500 141685 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.501 141685 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.501 141685 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.501 141685 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.501 141685 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.501 141685 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.501 141685 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.501 141685 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.501 141685 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.502 141685 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.502 141685 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.502 141685 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.502 141685 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.502 141685 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.502 141685 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.502 141685 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.502 141685 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.502 141685 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.502 141685 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.503 141685 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.503 141685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.503 141685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.503 141685 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.503 141685 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.503 141685 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.503 141685 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.503 141685 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.503 141685 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.504 141685 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.504 141685 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.504 141685 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.504 141685 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.504 141685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.504 141685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.504 141685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.504 141685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.505 141685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.505 141685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.505 141685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.505 141685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.505 141685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.505 141685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.505 141685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.505 141685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.505 141685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.506 141685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.506 141685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.506 141685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.506 141685 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.506 141685 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.506 141685 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.506 141685 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.507 141685 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.507 141685 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.507 141685 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.507 141685 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.507 141685 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.507 141685 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.507 141685 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.507 141685 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.507 141685 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.508 141685 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.508 141685 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.508 141685 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.508 141685 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.508 141685 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.508 141685 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.508 141685 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.508 141685 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.508 141685 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.509 141685 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.509 141685 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.509 141685 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.509 141685 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.509 141685 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.509 141685 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.509 141685 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.510 141685 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.510 141685 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.510 141685 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.510 141685 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.510 141685 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.510 141685 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.511 141685 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.511 141685 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.511 141685 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.511 141685 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.511 141685 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.511 141685 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.511 141685 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.512 141685 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.512 141685 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.512 141685 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.512 141685 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.512 141685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.512 141685 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.512 141685 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.512 141685 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.512 141685 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.513 141685 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.513 141685 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.513 141685 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.513 141685 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.513 141685 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.513 141685 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.513 141685 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.513 141685 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.513 141685 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.514 141685 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.514 141685 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.514 141685 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.514 141685 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.514 141685 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.514 141685 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.514 141685 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.514 141685 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.514 141685 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.515 141685 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.515 141685 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.515 141685 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.515 141685 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.515 141685 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.515 141685 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.515 141685 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.515 141685 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.515 141685 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.515 141685 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.516 141685 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.516 141685 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.516 141685 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.516 141685 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.516 141685 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.516 141685 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.516 141685 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.516 141685 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.516 141685 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.516 141685 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.517 141685 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.517 141685 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.517 141685 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.517 141685 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.517 141685 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.517 141685 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.517 141685 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.517 141685 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.518 141685 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.518 141685 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.518 141685 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.518 141685 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.518 141685 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.518 141685 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.518 141685 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.518 141685 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.518 141685 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.519 141685 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.519 141685 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.519 141685 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.519 141685 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.519 141685 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.519 141685 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.519 141685 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.519 141685 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.520 141685 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.520 141685 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.520 141685 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.520 141685 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.520 141685 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.520 141685 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.520 141685 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.521 141685 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.521 141685 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.521 141685 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.521 141685 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.521 141685 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.521 141685 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.522 141685 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.522 141685 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.522 141685 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.522 141685 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.522 141685 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.523 141685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.523 141685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.523 141685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.523 141685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.523 141685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.523 141685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.523 141685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.524 141685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.524 141685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.524 141685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.524 141685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.524 141685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.524 141685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.525 141685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.525 141685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.525 141685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.525 141685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.525 141685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.525 141685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.525 141685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.525 141685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.526 141685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.526 141685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.526 141685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.526 141685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.526 141685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.526 141685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.526 141685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.526 141685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.527 141685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.527 141685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.527 141685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.527 141685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.527 141685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.527 141685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.527 141685 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.538 141685 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.538 141685 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.539 141685 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.539 141685 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.539 141685 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.560 141685 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name b99910e3-15ec-4cc7-b887-f5229f22d165 (UUID: b99910e3-15ec-4cc7-b887-f5229f22d165) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.586 141685 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.586 141685 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.587 141685 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.587 141685 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.589 141685 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.595 141685 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.603 141685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', 'b99910e3-15ec-4cc7-b887-f5229f22d165'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7fe07c6626d0>], external_ids={}, name=b99910e3-15ec-4cc7-b887-f5229f22d165, nb_cfg_timestamp=1764583193201, nb_cfg=1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.604 141685 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7fe07c650f70>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.605 141685 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.605 141685 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.605 141685 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.606 141685 INFO oslo_service.service [-] Starting 1 workers#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.611 141685 DEBUG oslo_service.service [-] Started child 141792 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.615 141685 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmp36k9jmiq/privsep.sock']#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.615 141792 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-957663'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.638 141792 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.639 141792 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.639 141792 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.643 141792 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.649 141792 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Dec  1 05:01:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:04.656 141792 INFO eventlet.wsgi.server [-] (141792) wsgi starting up on http:/var/lib/neutron/metadata_proxy#033[00m
Dec  1 05:01:04 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:04 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6318003fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:05 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:01:05 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:01:05 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:01:05.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:01:05 np0005540826 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Dec  1 05:01:05 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:05 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f63140032f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:05 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:05.412 141685 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Dec  1 05:01:05 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:05.413 141685 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp36k9jmiq/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m
Dec  1 05:01:05 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:05.229 141797 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Dec  1 05:01:05 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:05.236 141797 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Dec  1 05:01:05 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:05.240 141797 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none#033[00m
Dec  1 05:01:05 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:05.240 141797 INFO oslo.privsep.daemon [-] privsep daemon running as pid 141797#033[00m
Dec  1 05:01:05 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:05.415 141797 DEBUG oslo.privsep.daemon [-] privsep: reply[3f8020ff-5975-4aed-aef1-28ee5a5ad909]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:01:05 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:01:05 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:01:05 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:01:05.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:01:05 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:01:05 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:05.964 141797 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:01:05 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:05.964 141797 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:01:05 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:05.964 141797 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:01:06 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:06 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6324004140 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.551 141797 DEBUG oslo.privsep.daemon [-] privsep: reply[f80b5202-a6db-4586-9e35-b3d9625b7047]: (4, []) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.554 141685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=b99910e3-15ec-4cc7-b887-f5229f22d165, column=external_ids, values=({'neutron:ovn-metadata-id': '13c562a3-f4c1-5f47-b061-d0bae17c419b'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.563 141685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=b99910e3-15ec-4cc7-b887-f5229f22d165, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.569 141685 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.569 141685 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.569 141685 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.569 141685 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.569 141685 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.569 141685 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.569 141685 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.569 141685 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.570 141685 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.570 141685 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.570 141685 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.570 141685 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.570 141685 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.570 141685 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.571 141685 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.571 141685 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.571 141685 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.571 141685 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.571 141685 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.571 141685 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.571 141685 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.572 141685 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.572 141685 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.572 141685 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.572 141685 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.572 141685 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.572 141685 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.572 141685 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.572 141685 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.573 141685 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.573 141685 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.573 141685 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.573 141685 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.573 141685 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.573 141685 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.573 141685 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.573 141685 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.574 141685 DEBUG oslo_service.service [-] host                           = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.574 141685 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.574 141685 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.574 141685 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.574 141685 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.574 141685 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.574 141685 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.574 141685 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.574 141685 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.575 141685 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.575 141685 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.575 141685 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.575 141685 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.575 141685 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.575 141685 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.575 141685 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.575 141685 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.575 141685 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.575 141685 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.576 141685 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.576 141685 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.576 141685 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.576 141685 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.576 141685 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.576 141685 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.576 141685 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.576 141685 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.576 141685 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.576 141685 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.577 141685 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.577 141685 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.577 141685 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.577 141685 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.577 141685 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.577 141685 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.577 141685 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.577 141685 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.578 141685 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.578 141685 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.578 141685 DEBUG oslo_service.service [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.578 141685 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.578 141685 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.578 141685 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.578 141685 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.578 141685 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.578 141685 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.578 141685 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.579 141685 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.579 141685 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.579 141685 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.579 141685 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.579 141685 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.579 141685 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.579 141685 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.579 141685 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.579 141685 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.579 141685 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.580 141685 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.580 141685 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.580 141685 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.580 141685 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.580 141685 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.580 141685 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.580 141685 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.580 141685 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.581 141685 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.581 141685 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.581 141685 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.581 141685 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.581 141685 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.581 141685 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.581 141685 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.581 141685 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.582 141685 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.582 141685 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.582 141685 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.582 141685 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.582 141685 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.582 141685 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.582 141685 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.582 141685 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.583 141685 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.583 141685 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.583 141685 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.583 141685 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.583 141685 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.583 141685 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.583 141685 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.584 141685 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.584 141685 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.584 141685 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.584 141685 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.584 141685 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.584 141685 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.584 141685 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.585 141685 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.585 141685 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.585 141685 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.585 141685 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.585 141685 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.585 141685 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.585 141685 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.586 141685 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.586 141685 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.586 141685 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.586 141685 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.586 141685 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.586 141685 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.586 141685 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.587 141685 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.587 141685 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.587 141685 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.587 141685 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.587 141685 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.587 141685 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.587 141685 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.587 141685 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.588 141685 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.588 141685 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.588 141685 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.588 141685 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.588 141685 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.588 141685 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.589 141685 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.589 141685 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.589 141685 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.589 141685 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.589 141685 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.589 141685 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.589 141685 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.589 141685 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.590 141685 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.590 141685 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.590 141685 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.590 141685 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.590 141685 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.590 141685 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.590 141685 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.590 141685 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.590 141685 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.590 141685 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.591 141685 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.591 141685 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.591 141685 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.591 141685 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.591 141685 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.591 141685 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.591 141685 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.591 141685 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.592 141685 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.592 141685 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.592 141685 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.592 141685 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.592 141685 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.592 141685 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.593 141685 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.593 141685 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.593 141685 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.593 141685 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.593 141685 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.593 141685 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.593 141685 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.593 141685 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.594 141685 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.594 141685 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.594 141685 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.594 141685 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.594 141685 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.594 141685 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.594 141685 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.594 141685 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.594 141685 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.594 141685 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.595 141685 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.595 141685 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.595 141685 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.595 141685 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.595 141685 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.595 141685 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.595 141685 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.595 141685 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.595 141685 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.595 141685 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.596 141685 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.596 141685 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.596 141685 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.596 141685 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.596 141685 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.596 141685 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.596 141685 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.596 141685 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.596 141685 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.596 141685 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.597 141685 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.597 141685 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.597 141685 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.597 141685 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.597 141685 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.597 141685 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.597 141685 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.597 141685 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.597 141685 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.598 141685 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.598 141685 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.598 141685 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.598 141685 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.598 141685 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.598 141685 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.598 141685 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.598 141685 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.598 141685 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.598 141685 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.599 141685 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.599 141685 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.599 141685 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.599 141685 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.599 141685 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.599 141685 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.599 141685 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.599 141685 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.599 141685 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.600 141685 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.600 141685 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.600 141685 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.600 141685 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.600 141685 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.600 141685 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.600 141685 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.600 141685 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.601 141685 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.601 141685 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.601 141685 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.601 141685 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.601 141685 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.601 141685 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.601 141685 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.601 141685 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.601 141685 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.602 141685 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.602 141685 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.602 141685 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.602 141685 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.602 141685 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.602 141685 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.602 141685 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.602 141685 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.602 141685 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.602 141685 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.603 141685 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.603 141685 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.603 141685 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.603 141685 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.603 141685 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.603 141685 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.603 141685 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.603 141685 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.603 141685 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.604 141685 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.604 141685 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.604 141685 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.604 141685 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.604 141685 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.604 141685 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.604 141685 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.605 141685 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.605 141685 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:01:06.605 141685 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Dec  1 05:01:06 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:06 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f633c0025c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:07 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:01:07 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 05:01:07 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:01:07.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 05:01:07 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:07 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6318003fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:07 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:01:07 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 05:01:07 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:01:07.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 05:01:08 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:08 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6314004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:08 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:08 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6324004140 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:09 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:01:09 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:01:09 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:01:09.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:01:09 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:09 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f633c0025c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:09 np0005540826 systemd-logind[787]: New session 52 of user zuul.
Dec  1 05:01:09 np0005540826 systemd[1]: Started Session 52 of User zuul.
Dec  1 05:01:09 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:01:09 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:01:09 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:01:09.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:01:10 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:10 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6318003fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:10 np0005540826 python3.9[141958]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 05:01:10 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:01:10 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:10 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6318003fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:11 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:01:11 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:01:11 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:01:11.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:01:11 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:11 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6324004140 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:11 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:01:11 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:01:11 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:01:11.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:01:12 np0005540826 python3.9[142183]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 05:01:12 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:12 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f633c0025c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:12 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:12 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6318003fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:13 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:01:13 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:01:13 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:01:13.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:01:13 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:13 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6318003fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:13 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 05:01:13 np0005540826 python3.9[142361]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec  1 05:01:13 np0005540826 systemd[1]: Reloading.
Dec  1 05:01:13 np0005540826 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 05:01:13 np0005540826 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 05:01:13 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:01:13 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 05:01:13 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:01:13.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 05:01:14 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis[85207]: [WARNING] 334/100114 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  1 05:01:14 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:14 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6324004140 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:14 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:01:14 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:01:14 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 05:01:14 np0005540826 python3.9[142547]: ansible-ansible.builtin.service_facts Invoked
Dec  1 05:01:14 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:14 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f633c0025c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:14 np0005540826 network[142564]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec  1 05:01:14 np0005540826 network[142565]: 'network-scripts' will be removed from distribution in near future.
Dec  1 05:01:14 np0005540826 network[142566]: It is advised to switch to 'NetworkManager' instead for network management.
Dec  1 05:01:15 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:01:15 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:01:15 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:01:15.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:01:15 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:15 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6314004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:15 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:01:15 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:01:15 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:01:15.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:01:15 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:01:16 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:16 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6318003fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:16 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:16 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6324004140 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:17 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:01:17 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:01:17 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:01:17.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:01:17 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:17 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f633c0025c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:17 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:01:17 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:01:17 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:01:17.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:01:18 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:18 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6318003fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:18 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis[85207]: [WARNING] 334/100118 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 1ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  1 05:01:18 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:18 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6318003fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:19 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:01:19 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:01:19 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:01:19.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:01:19 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:19 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6344001ff0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:19 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:01:19 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 05:01:19 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:01:19.856 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 05:01:20 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:20 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f633c0025c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:20 np0005540826 python3.9[142883]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 05:01:20 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:01:20 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:20 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6318003fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:20 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:01:20 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:01:21 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:01:21 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:01:21 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:01:21.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:01:21 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:21 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6314004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:21 np0005540826 python3.9[143036]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 05:01:21 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:01:21 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:01:21 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:01:21.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:01:22 np0005540826 podman[143161]: 2025-12-01 10:01:22.016686769 +0000 UTC m=+0.132944468 container health_status 6b06bbb7dde72e1297fe084ecc5611549b12415c8a2a7d771b9728c6924dbf55 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2)
Dec  1 05:01:22 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:22 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6344001ff0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:22 np0005540826 python3.9[143206]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 05:01:22 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:22 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f633c0025c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:23 np0005540826 python3.9[143369]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 05:01:23 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:01:23 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:01:23 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:01:23.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:01:23 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:23 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6318003fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:23 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:23 : epoch 692d671b : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  1 05:01:23 np0005540826 python3.9[143522]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 05:01:23 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:01:23 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:01:23 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:01:23.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:01:24 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:24 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6314004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:24 np0005540826 python3.9[143676]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 05:01:24 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:24 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6344001ff0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:25 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:01:25 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:01:25 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:01:25.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:01:25 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:25 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f633c0025c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:25 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:01:25 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:01:25 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:01:25.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:01:25 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:01:26 np0005540826 python3.9[143831]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 05:01:26 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:26 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6318003fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:26 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:26 : epoch 692d671b : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  1 05:01:26 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:26 : epoch 692d671b : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 05:01:26 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:26 : epoch 692d671b : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 05:01:26 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:26 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6314004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:27 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:01:27 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:01:27 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:01:27.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:01:27 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:27 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6344001ff0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:27 np0005540826 python3.9[143988]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:01:27 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:01:27 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:01:27 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:01:27.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:01:28 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:28 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f633c0025c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:28 np0005540826 python3.9[144141]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:01:28 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:28 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6318003fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:29 np0005540826 python3.9[144293]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:01:29 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:01:29 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 05:01:29 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:01:29.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 05:01:29 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:29 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6314004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:29 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:29 : epoch 692d671b : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  1 05:01:29 np0005540826 python3.9[144445]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:01:29 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:01:29 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:01:29 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:01:29.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:01:30 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:30 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f63440036c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:30 np0005540826 python3.9[144598]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:01:30 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:01:30 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:30 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f633c0025c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:31 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:01:31 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:01:31 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:01:31.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:01:31 np0005540826 python3.9[144750]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:01:31 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:31 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6318003fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:31 np0005540826 python3.9[144902]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:01:31 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:01:31 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:01:31 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:01:31.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:01:32 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:32 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6314004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:32 np0005540826 podman[145027]: 2025-12-01 10:01:32.658954133 +0000 UTC m=+0.063581503 container health_status 2f6a34a2eb1139886d5df897b4264e2aa17883bd121a8757ac91426f6d44356f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec  1 05:01:32 np0005540826 python3.9[145074]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:01:32 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:32 : epoch 692d671b : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  1 05:01:32 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:32 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f63440036c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:33 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:01:33 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 05:01:33 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:01:33.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 05:01:33 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:33 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f633c0025c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:33 np0005540826 python3.9[145226]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:01:33 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:01:33 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 05:01:33 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:01:33.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 05:01:34 np0005540826 python3.9[145378]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:01:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:34 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6318003fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:34 np0005540826 python3.9[145531]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:01:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:34 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6314004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:35 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:01:35 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:01:35 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:01:35.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:01:35 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:35 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f63440044f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:35 np0005540826 python3.9[145683]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:01:35 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:01:35 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 05:01:35 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:01:35.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 05:01:35 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:35 : epoch 692d671b : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  1 05:01:35 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:35 : epoch 692d671b : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 05:01:35 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:01:36 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis[85207]: [WARNING] 334/100136 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 1ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  1 05:01:36 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:36 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f633c0025c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:36 np0005540826 python3.9[145836]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:01:36 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:36 : epoch 692d671b : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  1 05:01:36 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:36 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6318003fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:37 np0005540826 python3.9[145988]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:01:37 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:01:37 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:01:37 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:01:37.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:01:37 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:37 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6314004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:37 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:01:37 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:01:37 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:01:37.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:01:38 np0005540826 python3.9[146141]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 05:01:38 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:38 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f63440044f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:38 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:38 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f633c0025c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:39 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:01:39 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:01:39 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:01:39.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:01:39 np0005540826 python3.9[146318]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec  1 05:01:39 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:39 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6318003fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:39 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:01:39 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:01:39 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:01:39.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:01:40 np0005540826 python3.9[146470]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec  1 05:01:40 np0005540826 systemd[1]: Reloading.
Dec  1 05:01:40 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:40 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6318003fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:40 np0005540826 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 05:01:40 np0005540826 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 05:01:40 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis[85207]: [WARNING] 334/100140 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 1ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  1 05:01:40 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:01:40 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:40 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f63440044f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:41 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:01:41 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:01:41 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:01:41.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:01:41 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:41 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f633c0025c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:41 np0005540826 python3.9[146658]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 05:01:41 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:01:41 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:01:41 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:01:41.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:01:42 np0005540826 python3.9[146811]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 05:01:42 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:42 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6318003fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:42 np0005540826 python3.9[146965]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 05:01:42 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:42 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6318003fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:43 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:01:43 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 05:01:43 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:01:43.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 05:01:43 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:43 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f63440044f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:43 np0005540826 python3.9[147118]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 05:01:43 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:01:43 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:01:43 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:01:43.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:01:44 np0005540826 python3.9[147271]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 05:01:44 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:44 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f633c0025c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:44 np0005540826 python3.9[147425]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 05:01:44 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:44 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f633c0025c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:45 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:01:45 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:01:45 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:01:45.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:01:45 np0005540826 python3.9[147578]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 05:01:45 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:45 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f633c0025c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:45 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:01:45 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:01:45 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:01:45.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:01:45 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:01:46 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:46 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6314004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:46 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:46 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6318003fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:47 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:01:47 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:01:47 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:01:47.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:01:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:47 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f63440044f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:47 np0005540826 python3.9[147732]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Dec  1 05:01:47 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:01:47 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:01:47 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:01:47.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:01:48 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:48 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f633c0025c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:48 np0005540826 python3.9[147886]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec  1 05:01:49 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:49 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f633c0025c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:49 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:01:49 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:01:49 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:01:49.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:01:49 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:49 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6318003fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:49 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:01:49 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:01:49 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:01:49.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:01:50 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:50 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f63440044f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:50 np0005540826 python3.9[148047]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec  1 05:01:50 np0005540826 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  1 05:01:50 np0005540826 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  1 05:01:50 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:01:51 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:51 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6324002690 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:51 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:01:51 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 05:01:51 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:01:51.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 05:01:51 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:51 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6314004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:51 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:01:51 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:01:51 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:01:51.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:01:52 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis[85207]: [WARNING] 334/100152 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  1 05:01:52 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:52 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6318003fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:52 np0005540826 podman[148181]: 2025-12-01 10:01:52.532211764 +0000 UTC m=+0.100023372 container health_status 6b06bbb7dde72e1297fe084ecc5611549b12415c8a2a7d771b9728c6924dbf55 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec  1 05:01:52 np0005540826 python3.9[148224]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  1 05:01:53 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:53 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f63440044f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:53 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:01:53 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:01:53 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:01:53.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:01:53 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:53 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f63440044f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:53 np0005540826 python3.9[148317]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  1 05:01:53 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:01:53 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:01:53 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:01:53.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:01:54 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:54 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6314004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:55 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:55 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6318003fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:55 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:01:55 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:01:55 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:01:55.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:01:55 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:55 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f63440044f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:55 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:01:55 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:01:55 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:01:55.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:01:55 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:01:56 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:56 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6324002690 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:57 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:57 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f63440044f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:57 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:01:57 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 05:01:57 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:01:57.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 05:01:57 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:57 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6318003fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:57 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:01:57 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:01:57 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:01:57.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:01:58 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:58 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6314004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:59 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:59 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6314004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:59 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:01:59 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:01:59 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:01:59.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:01:59 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:01:59 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f63440044f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:59 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:01:59 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:01:59 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:01:59.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:02:00 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:02:00 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6318003fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:00 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:02:00 : epoch 692d671b : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  1 05:02:00 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:02:01 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:02:01 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6314004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:01 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:02:01 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 05:02:01 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:02:01.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 05:02:01 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:02:01 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6324002690 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:01 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:02:01 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:02:01 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:02:01.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:02:02 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:02:02 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f63440044f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:02 np0005540826 podman[148377]: 2025-12-01 10:02:02.997280206 +0000 UTC m=+0.068030491 container health_status 2f6a34a2eb1139886d5df897b4264e2aa17883bd121a8757ac91426f6d44356f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Dec  1 05:02:03 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:02:03 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6318003fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:03 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:02:03 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:02:03 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:02:03.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:02:03 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:02:03 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6314004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:03 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:02:03 : epoch 692d671b : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  1 05:02:03 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:02:03 : epoch 692d671b : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 05:02:03 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:02:03 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:02:03 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:02:03.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:02:04 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:02:04 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6324002690 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:02:04.531 141685 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:02:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:02:04.533 141685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:02:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:02:04.533 141685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:02:05 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:02:05 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f63440044f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:05 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:02:05 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:02:05 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:02:05.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:02:05 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:02:05 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6318003fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:05 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:02:05 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 05:02:05 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:02:05.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 05:02:05 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:02:06 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:02:06 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6318003fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:06 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:02:06 : epoch 692d671b : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  1 05:02:07 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:02:07 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6324002690 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:07 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:02:07 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:02:07 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:02:07.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:02:07 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:02:07 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f63440044f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:07 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:02:07 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:02:07 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:02:07.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:02:08 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:02:08 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6318003fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:09 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:02:09 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6314004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:09 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:02:09 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:02:09 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:02:09.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:02:09 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:02:09 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6324002690 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:09 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:02:09 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:02:09 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:02:09.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:02:10 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:02:10 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f63440044f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:10 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:02:11 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:02:11 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6318003fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:11 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:02:11 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:02:11 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:02:11.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:02:11 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:02:11 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6314004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:11 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:02:11 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:02:11 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:02:11.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:02:12 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis[85207]: [WARNING] 334/100212 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 1ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  1 05:02:12 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:02:12 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6324002690 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:13 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:02:13 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6324002690 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:13 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:02:13 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:02:13 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:02:13.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:02:13 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:02:13 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6318003fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:13 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:02:13 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:02:13 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:02:13.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:02:14 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:02:14 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6314004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:15 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:02:15 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6324002690 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:15 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:02:15 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:02:15 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:02:15.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:02:15 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:02:15 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6324002690 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:15 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:02:15 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 05:02:15 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:02:15.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 05:02:15 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:02:16 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:02:16 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6318003fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:17 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:02:17 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6314004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:17 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:02:17 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:02:17 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:02:17.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:02:17 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:02:17 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6324002690 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:17 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:02:17 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 05:02:17 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:02:17.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 05:02:18 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:02:18 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f63440044f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:19 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:02:19 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6318003fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:19 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:02:19 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:02:19 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:02:19.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:02:19 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:02:19 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6314004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:19 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:02:19 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000028s ======
Dec  1 05:02:19 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:02:19.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Dec  1 05:02:20 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:02:20 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f63440044f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:20 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:02:21 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:02:21 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f63440044f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:21 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:02:21 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:02:21 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:02:21.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:02:21 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 05:02:21 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:02:21 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f633c001bb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:21 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:02:21 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000028s ======
Dec  1 05:02:21 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:02:21.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Dec  1 05:02:22 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:02:22 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f63140040b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:22 np0005540826 kernel: SELinux:  Converting 2771 SID table entries...
Dec  1 05:02:22 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:02:22 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:02:22 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 05:02:22 np0005540826 kernel: SELinux:  policy capability network_peer_controls=1
Dec  1 05:02:22 np0005540826 kernel: SELinux:  policy capability open_perms=1
Dec  1 05:02:22 np0005540826 kernel: SELinux:  policy capability extended_socket_class=1
Dec  1 05:02:22 np0005540826 kernel: SELinux:  policy capability always_check_network=0
Dec  1 05:02:22 np0005540826 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec  1 05:02:22 np0005540826 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec  1 05:02:22 np0005540826 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec  1 05:02:22 np0005540826 dbus-broker-launch[776]: avc:  op=load_policy lsm=selinux seqno=12 res=1
Dec  1 05:02:23 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:02:23 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f63200012e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:23 np0005540826 podman[148683]: 2025-12-01 10:02:23.046012513 +0000 UTC m=+0.112004817 container health_status 6b06bbb7dde72e1297fe084ecc5611549b12415c8a2a7d771b9728c6924dbf55 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  1 05:02:23 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:02:23 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:02:23 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:02:23.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:02:23 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:02:23 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6324004530 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:23 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:02:23 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:02:23 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:02:23.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:02:24 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:02:24 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f633c001bb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:25 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:02:25 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f63140040d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:25 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:02:25 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:02:25 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:02:25.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:02:25 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:02:25 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f63200012e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:25 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:02:25 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:02:25 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:02:25.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:02:25 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:02:26 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:02:26 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6324004530 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:27 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:02:27 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f633c001bb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:27 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:02:27 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:02:27 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:02:27.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:02:27 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:02:27 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f63140040f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:27 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:02:27 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:02:27 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:02:27.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:02:28 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:02:28 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f63200021d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:28 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:02:28 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:02:29 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:02:29 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6324004530 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:29 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:02:29 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:02:29 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:02:29.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:02:29 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:02:29 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f633c002cb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:29 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:02:29 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 05:02:29 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:02:29.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 05:02:30 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:02:30 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6314004110 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:30 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:02:31 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:02:31 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f63200021d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:31 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:02:31 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:02:31 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:02:31.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:02:31 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:02:31 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6324004530 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:31 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:02:31 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:02:31 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:02:31.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:02:32 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:02:32 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f633c002cb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:33 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:02:33 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6314004130 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:33 np0005540826 kernel: SELinux:  Converting 2771 SID table entries...
Dec  1 05:02:33 np0005540826 kernel: SELinux:  policy capability network_peer_controls=1
Dec  1 05:02:33 np0005540826 kernel: SELinux:  policy capability open_perms=1
Dec  1 05:02:33 np0005540826 kernel: SELinux:  policy capability extended_socket_class=1
Dec  1 05:02:33 np0005540826 kernel: SELinux:  policy capability always_check_network=0
Dec  1 05:02:33 np0005540826 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec  1 05:02:33 np0005540826 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec  1 05:02:33 np0005540826 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec  1 05:02:33 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:02:33 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 05:02:33 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:02:33.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 05:02:33 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:02:33 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6320002ee0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:33 np0005540826 dbus-broker-launch[776]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Dec  1 05:02:33 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:02:33 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 05:02:33 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:02:33.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 05:02:33 np0005540826 podman[148747]: 2025-12-01 10:02:33.995156718 +0000 UTC m=+0.066552167 container health_status 2f6a34a2eb1139886d5df897b4264e2aa17883bd121a8757ac91426f6d44356f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 05:02:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:02:34 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6324004530 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:35 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:02:35 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f633c002cb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:35 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:02:35 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:02:35 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:02:35.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:02:35 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:02:35 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f633c002cb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:35 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:02:35 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:02:35 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:02:35.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:02:35 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:02:36 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:02:36 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6320002ee0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:37 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:02:37 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6324004530 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:37 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:02:37 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:02:37 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:02:37.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:02:37 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:02:37 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6314004170 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:37 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:02:37 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:02:37 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:02:37.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:02:38 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:02:38 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f633c004530 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:39 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:02:39 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6320003bf0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:39 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:02:39 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 05:02:39 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:02:39.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 05:02:39 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:02:39 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6324004530 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:39 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:02:39 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000028s ======
Dec  1 05:02:39 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:02:39.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Dec  1 05:02:40 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:02:40 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6314004190 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:40 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:02:41 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:02:41 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f633c004530 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:41 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:02:41 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:02:41 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:02:41.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:02:41 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:02:41 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6320003bf0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:41 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:02:41 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000028s ======
Dec  1 05:02:41 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:02:41.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Dec  1 05:02:42 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis[85207]: [WARNING] 334/100242 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  1 05:02:42 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:02:42 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6324004530 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:43 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:02:43 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f63140041b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:43 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:02:43 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000028s ======
Dec  1 05:02:43 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:02:43.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Dec  1 05:02:43 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:02:43 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f633c004530 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:43 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:02:43 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 05:02:43 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:02:43.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 05:02:44 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:02:44 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6324004530 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:45 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:02:45 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6320003bf0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:45 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:02:45 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000028s ======
Dec  1 05:02:45 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:02:45.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Dec  1 05:02:45 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:02:45 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f63140041d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:45 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:02:45 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:02:45 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:02:45 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:02:45.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:02:46 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:02:46 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f633c004530 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:02:47 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6324004530 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:47 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:02:47 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:02:47 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:02:47.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:02:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:02:47 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6320003bf0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:48 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:02:48 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:02:48 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:02:47.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:02:48 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:02:48 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f63140041f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:49 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:02:49 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f633c004530 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:49 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:02:49 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:02:49 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:02:49.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:02:49 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:02:49 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6324004530 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:50 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:02:50 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:02:50 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:02:50.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:02:50 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:02:50 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6320003bf0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:50 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:02:51 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:02:51 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6314004210 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:51 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:02:51 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:02:51 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:02:51.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:02:51 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:02:51 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f633c004530 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:52 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:02:52 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 05:02:52 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:02:52.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 05:02:52 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:02:52 : epoch 692d671b : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  1 05:02:52 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:02:52 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6324004530 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:53 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:02:53 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6320003bf0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:53 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:02:53 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:02:53 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:02:53.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:02:53 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[133044]: 01/12/2025 10:02:53 : epoch 692d671b : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6344003920 fd 39 proxy ignored for local
Dec  1 05:02:53 np0005540826 kernel: ganesha.nfsd[152969]: segfault at 50 ip 00007f63f655332e sp 00007f63afffe210 error 4 in libntirpc.so.5.8[7f63f6538000+2c000] likely on CPU 0 (core 0, socket 0)
Dec  1 05:02:53 np0005540826 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Dec  1 05:02:53 np0005540826 systemd[1]: Started Process Core Dump (PID 153556/UID 0).
Dec  1 05:02:53 np0005540826 podman[153571]: 2025-12-01 10:02:53.549575949 +0000 UTC m=+0.102912450 container health_status 6b06bbb7dde72e1297fe084ecc5611549b12415c8a2a7d771b9728c6924dbf55 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, container_name=ovn_controller)
Dec  1 05:02:54 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:02:54 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:02:54 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:02:54.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:02:55 np0005540826 systemd-coredump[153582]: Process 133056 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 66:#012#0  0x00007f63f655332e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Dec  1 05:02:55 np0005540826 systemd[1]: systemd-coredump@5-153556-0.service: Deactivated successfully.
Dec  1 05:02:55 np0005540826 systemd[1]: systemd-coredump@5-153556-0.service: Consumed 1.389s CPU time.
Dec  1 05:02:55 np0005540826 podman[154582]: 2025-12-01 10:02:55.247027618 +0000 UTC m=+0.025871207 container died 1192288e0cc1942c5cb4c668320781d307b355624ff2f41330c5b4eb21512de6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, io.buildah.version=1.40.1)
Dec  1 05:02:55 np0005540826 systemd[1]: var-lib-containers-storage-overlay-ab5d4d864cc50e5c673533fcabc44108e5b00d0000a14e3fd9db3b9dc258a96a-merged.mount: Deactivated successfully.
Dec  1 05:02:55 np0005540826 podman[154582]: 2025-12-01 10:02:55.301457633 +0000 UTC m=+0.080301222 container remove 1192288e0cc1942c5cb4c668320781d307b355624ff2f41330c5b4eb21512de6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_REF=squid)
Dec  1 05:02:55 np0005540826 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.0.0.compute-1.osfnzc.service: Main process exited, code=exited, status=139/n/a
Dec  1 05:02:55 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:02:55 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:02:55 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:02:55.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:02:55 np0005540826 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.0.0.compute-1.osfnzc.service: Failed with result 'exit-code'.
Dec  1 05:02:55 np0005540826 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.0.0.compute-1.osfnzc.service: Consumed 1.843s CPU time.
Dec  1 05:02:55 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:02:56 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:02:56 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:02:56 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:02:56.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:02:57 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:02:57 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 05:02:57 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:02:57.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 05:02:58 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:02:58 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 05:02:58 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:02:58.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 05:02:59 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:02:59 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:02:59 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:02:59.350 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:02:59 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis[85207]: [WARNING] 334/100259 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  1 05:03:00 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:03:00 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:03:00 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:03:00.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:03:00 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:03:01 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:03:01 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:03:01 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:03:01.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:03:02 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:03:02 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:03:02 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:03:02.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:03:03 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:03:03 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:03:03 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:03:03.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:03:04 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:03:04 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000028s ======
Dec  1 05:03:04 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:03:04.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Dec  1 05:03:04 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis[85207]: [WARNING] 334/100304 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 1ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  1 05:03:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:03:04.532 141685 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:03:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:03:04.533 141685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:03:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:03:04.533 141685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:03:04 np0005540826 podman[160030]: 2025-12-01 10:03:04.973995892 +0000 UTC m=+0.054453337 container health_status 2f6a34a2eb1139886d5df897b4264e2aa17883bd121a8757ac91426f6d44356f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec  1 05:03:05 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:03:05 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:03:05 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:03:05.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:03:05 np0005540826 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.0.0.compute-1.osfnzc.service: Scheduled restart job, restart counter is at 6.
Dec  1 05:03:05 np0005540826 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.osfnzc for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 05:03:05 np0005540826 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.0.0.compute-1.osfnzc.service: Consumed 1.843s CPU time.
Dec  1 05:03:05 np0005540826 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.osfnzc for 365f19c2-81e5-5edd-b6b4-280555214d3a...
Dec  1 05:03:05 np0005540826 podman[160593]: 2025-12-01 10:03:05.875372628 +0000 UTC m=+0.049383598 container create 16cf7695b9f1b772a53f155c9ea432f313af7c84182dd3ee2dd859b9e0afc79d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True)
Dec  1 05:03:05 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d4290427fc7952b359d9766abfabad9cb592511b7a0a73e255bf414248b1b3e5/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Dec  1 05:03:05 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d4290427fc7952b359d9766abfabad9cb592511b7a0a73e255bf414248b1b3e5/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 05:03:05 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d4290427fc7952b359d9766abfabad9cb592511b7a0a73e255bf414248b1b3e5/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 05:03:05 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d4290427fc7952b359d9766abfabad9cb592511b7a0a73e255bf414248b1b3e5/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.osfnzc-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 05:03:05 np0005540826 podman[160593]: 2025-12-01 10:03:05.941474962 +0000 UTC m=+0.115485962 container init 16cf7695b9f1b772a53f155c9ea432f313af7c84182dd3ee2dd859b9e0afc79d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1)
Dec  1 05:03:05 np0005540826 podman[160593]: 2025-12-01 10:03:05.852962907 +0000 UTC m=+0.026973957 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 05:03:05 np0005540826 podman[160593]: 2025-12-01 10:03:05.959294489 +0000 UTC m=+0.133305459 container start 16cf7695b9f1b772a53f155c9ea432f313af7c84182dd3ee2dd859b9e0afc79d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.schema-version=1.0)
Dec  1 05:03:05 np0005540826 bash[160593]: 16cf7695b9f1b772a53f155c9ea432f313af7c84182dd3ee2dd859b9e0afc79d
Dec  1 05:03:05 np0005540826 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.osfnzc for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 05:03:05 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:05 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Dec  1 05:03:05 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:05 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Dec  1 05:03:05 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:03:06 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:03:06 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000028s ======
Dec  1 05:03:06 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:03:06.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Dec  1 05:03:06 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:06 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Dec  1 05:03:06 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:06 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Dec  1 05:03:06 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:06 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Dec  1 05:03:06 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:06 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Dec  1 05:03:06 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:06 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Dec  1 05:03:06 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:06 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  1 05:03:07 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:03:07 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:03:07 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:03:07.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:03:08 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:03:08 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:03:08 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:03:08.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:03:09 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:03:09 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:03:09 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:03:09.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:03:10 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:03:10 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 05:03:10 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:03:10.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 05:03:10 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:03:11 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:03:11 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000028s ======
Dec  1 05:03:11 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:03:11.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Dec  1 05:03:12 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:03:12 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:03:12 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:03:12.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:03:12 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:12 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  1 05:03:12 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:12 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 05:03:13 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:03:13 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:03:13 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:03:13.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:03:14 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:03:14 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000028s ======
Dec  1 05:03:14 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:03:14.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Dec  1 05:03:15 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:03:15 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:03:15 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:03:15.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:03:15 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:03:16 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:03:16 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:03:16 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:03:16.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:03:17 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:03:17 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000028s ======
Dec  1 05:03:17 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:03:17.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Dec  1 05:03:18 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:03:18 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000028s ======
Dec  1 05:03:18 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:03:18.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Dec  1 05:03:18 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:18 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  1 05:03:18 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:18 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Dec  1 05:03:18 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:18 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Dec  1 05:03:18 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:18 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Dec  1 05:03:18 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:18 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Dec  1 05:03:18 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:18 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Dec  1 05:03:18 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:18 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Dec  1 05:03:18 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:18 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:03:18 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:18 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:03:18 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:18 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:03:18 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:18 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Dec  1 05:03:18 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:18 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:03:18 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:18 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Dec  1 05:03:18 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:18 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Dec  1 05:03:18 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:18 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Dec  1 05:03:18 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:18 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Dec  1 05:03:18 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:18 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Dec  1 05:03:18 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:18 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Dec  1 05:03:18 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:18 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Dec  1 05:03:18 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:18 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Dec  1 05:03:18 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:18 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Dec  1 05:03:18 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:18 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Dec  1 05:03:18 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:18 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  1 05:03:18 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:18 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Dec  1 05:03:18 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:18 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  1 05:03:18 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:18 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Dec  1 05:03:18 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:18 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Dec  1 05:03:18 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:18 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e9c000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:19 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:19 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e8c001970 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:19 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:03:19 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:03:19 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:03:19.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:03:19 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:19 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e8c001970 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:20 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:03:20 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:03:20 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:03:20.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:03:20 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:20 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e78000fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:20 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:03:21 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:21 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e74000d00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:21 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:03:21 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:03:21 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:03:21.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:03:21 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis[85207]: [WARNING] 334/100321 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  1 05:03:21 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:21 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e6c000d00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:22 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:03:22 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:03:22 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:03:22.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:03:22 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:22 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e8c002680 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:23 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:23 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e78001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:23 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:03:23 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:03:23 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:03:23.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:03:23 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:23 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e78001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:24 np0005540826 podman[165873]: 2025-12-01 10:03:24.053501629 +0000 UTC m=+0.111709440 container health_status 6b06bbb7dde72e1297fe084ecc5611549b12415c8a2a7d771b9728c6924dbf55 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, tcib_managed=true)
Dec  1 05:03:24 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:03:24 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:03:24 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:03:24.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:03:24 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:24 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e6c001820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:25 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:25 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e8c002680 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:25 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:03:25 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:03:25 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:03:25.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:03:25 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:25 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e78001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:25 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:03:26 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:03:26 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:03:26 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:03:26.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:03:26 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:26 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e78001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:27 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:27 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e6c001820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:27 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:03:27 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:03:27 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:03:27.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:03:27 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:27 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e8c002680 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:28 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:03:28 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:03:28 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:03:28.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:03:28 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:28 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e78001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:28 np0005540826 podman[166023]: 2025-12-01 10:03:28.557351554 +0000 UTC m=+0.081975143 container exec 7c618b1a07db29948a07bf15abaac0e58a16d635b553771e228aad7ce3c0be89 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-crash-compute-1, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 05:03:28 np0005540826 podman[166023]: 2025-12-01 10:03:28.655519087 +0000 UTC m=+0.180142646 container exec_died 7c618b1a07db29948a07bf15abaac0e58a16d635b553771e228aad7ce3c0be89 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-crash-compute-1, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec  1 05:03:29 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:29 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e78001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:29 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:03:29 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:03:29 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:03:29.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:03:29 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:29 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e6c001820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:29 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Dec  1 05:03:29 np0005540826 podman[166145]: 2025-12-01 10:03:29.739688168 +0000 UTC m=+0.632673327 container exec b5cba524f9aa4c6a3dd34b5465b21a05089c422db1ad8a4bb4113778960afe0c (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec  1 05:03:29 np0005540826 podman[166145]: 2025-12-01 10:03:29.781879073 +0000 UTC m=+0.674864152 container exec_died b5cba524f9aa4c6a3dd34b5465b21a05089c422db1ad8a4bb4113778960afe0c (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec  1 05:03:30 np0005540826 kernel: SELinux:  Converting 2772 SID table entries...
Dec  1 05:03:30 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:03:30 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:03:30 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:03:30.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:03:30 np0005540826 kernel: SELinux:  policy capability network_peer_controls=1
Dec  1 05:03:30 np0005540826 kernel: SELinux:  policy capability open_perms=1
Dec  1 05:03:30 np0005540826 kernel: SELinux:  policy capability extended_socket_class=1
Dec  1 05:03:30 np0005540826 kernel: SELinux:  policy capability always_check_network=0
Dec  1 05:03:30 np0005540826 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec  1 05:03:30 np0005540826 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec  1 05:03:30 np0005540826 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec  1 05:03:30 np0005540826 podman[166234]: 2025-12-01 10:03:30.201376208 +0000 UTC m=+0.110013928 container exec 16cf7695b9f1b772a53f155c9ea432f313af7c84182dd3ee2dd859b9e0afc79d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 05:03:30 np0005540826 podman[166234]: 2025-12-01 10:03:30.21639058 +0000 UTC m=+0.125028270 container exec_died 16cf7695b9f1b772a53f155c9ea432f313af7c84182dd3ee2dd859b9e0afc79d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325)
Dec  1 05:03:30 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:30 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e8c002680 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:30 np0005540826 podman[166297]: 2025-12-01 10:03:30.466292294 +0000 UTC m=+0.062919835 container exec 5d28e05f02e2faaaecccbb224265ce968eb74994512ada8525a0c42a70019a9e (image=quay.io/ceph/haproxy:2.3, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis)
Dec  1 05:03:30 np0005540826 podman[166297]: 2025-12-01 10:03:30.479578904 +0000 UTC m=+0.076206445 container exec_died 5d28e05f02e2faaaecccbb224265ce968eb74994512ada8525a0c42a70019a9e (image=quay.io/ceph/haproxy:2.3, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis)
Dec  1 05:03:30 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:03:31 np0005540826 podman[166360]: 2025-12-01 10:03:31.046849395 +0000 UTC m=+0.096202956 container exec b99b1792fbad48bc30fedd676b925d405af73aba1041bee68b375eadfb2319cf (image=quay.io/ceph/keepalived:2.2.4, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-1-wzwqmm, build-date=2023-02-22T09:23:20, description=keepalived for Ceph, io.buildah.version=1.28.2, name=keepalived, com.redhat.component=keepalived-container, architecture=x86_64, io.openshift.expose-services=, distribution-scope=public, io.k8s.display-name=Keepalived on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, summary=Provides keepalived on RHEL 9 for Ceph., vcs-type=git, release=1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=2.2.4, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.openshift.tags=Ceph keepalived, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Dec  1 05:03:31 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:31 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e74001d00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:31 np0005540826 podman[166360]: 2025-12-01 10:03:31.08985936 +0000 UTC m=+0.139212941 container exec_died b99b1792fbad48bc30fedd676b925d405af73aba1041bee68b375eadfb2319cf (image=quay.io/ceph/keepalived:2.2.4, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-1-wzwqmm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=Ceph keepalived, release=1793, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, com.redhat.component=keepalived-container, description=keepalived for Ceph, io.k8s.display-name=Keepalived on RHEL 9, architecture=x86_64, io.openshift.expose-services=, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, name=keepalived, summary=Provides keepalived on RHEL 9 for Ceph., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2023-02-22T09:23:20, io.buildah.version=1.28.2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=2.2.4, distribution-scope=public)
Dec  1 05:03:31 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:03:31 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:03:31 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:03:31.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:03:31 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:31 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e78001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:31 np0005540826 dbus-broker-launch[776]: avc:  op=load_policy lsm=selinux seqno=14 res=1
Dec  1 05:03:31 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:03:31 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:03:31 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:03:31 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:03:31 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Dec  1 05:03:31 np0005540826 dbus-broker-launch[744]: Noticed file-system modification, trigger reload.
Dec  1 05:03:31 np0005540826 dbus-broker-launch[744]: Noticed file-system modification, trigger reload.
Dec  1 05:03:32 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:03:32 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.002000048s ======
Dec  1 05:03:32 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:03:32.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000048s
Dec  1 05:03:32 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:32 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e78001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:32 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Dec  1 05:03:32 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 05:03:32 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:03:32 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:03:32 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 05:03:33 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:33 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e8c002680 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:33 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:03:33 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:03:33 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:03:33.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:03:33 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:33 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e74001d00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:34 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:03:34 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:03:34 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:03:34.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:03:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:34 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e78001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:35 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:35 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e78001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:35 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:03:35 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:03:35 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:03:35.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:03:35 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:35 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e8c002680 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:35 np0005540826 podman[166544]: 2025-12-01 10:03:35.497171633 +0000 UTC m=+0.088875799 container health_status 2f6a34a2eb1139886d5df897b4264e2aa17883bd121a8757ac91426f6d44356f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec  1 05:03:35 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:03:36 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:03:36 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:03:36 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:03:36.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:03:36 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:36 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e8c002680 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:37 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:37 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e78001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:37 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:03:37 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:03:37 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:03:37.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:03:37 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:37 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e6c002fb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:37 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:03:37 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:03:38 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:03:38 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:03:38 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:03:38.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:03:38 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:38 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e6c002fb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:39 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:39 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e8c002680 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:39 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:03:39 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:03:39 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:03:39.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:03:39 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:39 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e74002cb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:39 np0005540826 systemd[1]: Stopping OpenSSH server daemon...
Dec  1 05:03:39 np0005540826 systemd[1]: sshd.service: Deactivated successfully.
Dec  1 05:03:39 np0005540826 systemd[1]: Stopped OpenSSH server daemon.
Dec  1 05:03:39 np0005540826 systemd[1]: sshd.service: Consumed 3.246s CPU time, read 32.0K from disk, written 32.0K to disk.
Dec  1 05:03:39 np0005540826 systemd[1]: Stopped target sshd-keygen.target.
Dec  1 05:03:39 np0005540826 systemd[1]: Stopping sshd-keygen.target...
Dec  1 05:03:39 np0005540826 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec  1 05:03:39 np0005540826 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec  1 05:03:39 np0005540826 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec  1 05:03:39 np0005540826 systemd[1]: Reached target sshd-keygen.target.
Dec  1 05:03:39 np0005540826 systemd[1]: Starting OpenSSH server daemon...
Dec  1 05:03:40 np0005540826 systemd[1]: Started OpenSSH server daemon.
Dec  1 05:03:40 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:03:40 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:03:40 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:03:40.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:03:40 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:40 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e6c002fb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:41 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:03:41 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:41 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e78003b20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:41 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:03:41 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:41 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e8c002680 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:41 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:03:41 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:03:41.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:03:41 np0005540826 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec  1 05:03:41 np0005540826 systemd[1]: Starting man-db-cache-update.service...
Dec  1 05:03:41 np0005540826 systemd[1]: Reloading.
Dec  1 05:03:42 np0005540826 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 05:03:42 np0005540826 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 05:03:42 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:03:42 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:03:42 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:03:42.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:03:42 np0005540826 systemd[1]: Queuing reload/restart jobs for marked units…
Dec  1 05:03:42 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:42 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e74002cb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:43 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:43 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e6c0040b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:43 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:43 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e6c0040b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:43 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:03:43 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:03:43 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:03:43.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:03:44 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:03:44 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:03:44 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:03:44.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:03:44 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:44 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e8c002680 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:45 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:45 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e74002cb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:45 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:45 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e78003b20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:45 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:03:45 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:03:45 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:03:45.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:03:46 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:03:46 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:03:46 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:03:46 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:03:46.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:03:46 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:46 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e6c0040b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:47 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e8c002680 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:47 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e74002cb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:47 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:03:47 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:03:47 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:03:47.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:03:48 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:03:48 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:03:48 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:03:48.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:03:48 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:48 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e78003b20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:49 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:49 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e6c0040b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:49 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:49 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e8c002680 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:49 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:03:49 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:03:49 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:03:49.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:03:50 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:03:50 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:03:50 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:03:50.101 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:03:50 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:50 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e74002cb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:51 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:03:51 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:51 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e74002cb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:51 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:51 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e980013a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:51 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:03:51 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:03:51 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:03:51.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:03:52 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:03:52 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:03:52 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:03:52.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:03:52 np0005540826 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec  1 05:03:52 np0005540826 systemd[1]: Finished man-db-cache-update.service.
Dec  1 05:03:52 np0005540826 systemd[1]: man-db-cache-update.service: Consumed 11.096s CPU time.
Dec  1 05:03:52 np0005540826 systemd[1]: run-rae023bc129c84148b29849384a18a4e1.service: Deactivated successfully.
Dec  1 05:03:52 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:52 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e90000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:53 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:53 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e8c003f60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:53 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:53 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e74002cb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:53 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:03:53 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:03:53 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:03:53.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:03:54 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:03:54 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:03:54 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:03:54.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:03:54 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:54 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e98001eb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:55 np0005540826 podman[176103]: 2025-12-01 10:03:55.028304563 +0000 UTC m=+0.100207833 container health_status 6b06bbb7dde72e1297fe084ecc5611549b12415c8a2a7d771b9728c6924dbf55 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.build-date=20251125, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec  1 05:03:55 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:55 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e90001930 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:55 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:55 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e8c003f60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:55 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:03:55 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:03:55 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:03:55.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:03:56 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:03:56 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:03:56 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:03:56 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:03:56.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:03:56 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:56 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e74002cb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:57 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:57 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e98001eb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:57 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:57 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e98001eb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:57 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:03:57 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:03:57 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:03:57.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:03:58 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:03:58 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:03:58 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:03:58.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:03:58 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:58 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e90001930 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:59 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:59 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e90001930 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:59 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:03:59 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e90001930 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:59 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:03:59 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:03:59 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:03:59.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:04:00 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:04:00 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:04:00 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:04:00.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:04:00 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:00 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e98002fb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:01 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:04:01 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:01 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e68000d00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:01 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:01 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e8c003f60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:01 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:04:01 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:04:01 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:04:01.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:04:02 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:04:02 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:04:02 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:04:02.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:04:02 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:02 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e90001930 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:03 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:03 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e98002fb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:03 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:03 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e68001820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:03 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:04:03 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:04:03 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:04:03.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:04:04 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:04:04 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:04:04 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:04:04.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:04:04 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:04 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e8c003f60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:04:04.534 141685 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:04:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:04:04.534 141685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:04:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:04:04.535 141685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:04:05 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:05 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e90003580 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:05 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:05 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e98003cc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:05 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:04:05 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:04:05 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:04:05.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:04:05 np0005540826 podman[176160]: 2025-12-01 10:04:05.976485584 +0000 UTC m=+0.061644375 container health_status 2f6a34a2eb1139886d5df897b4264e2aa17883bd121a8757ac91426f6d44356f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Dec  1 05:04:06 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:04:06 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:04:06 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:04:06 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:04:06.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:04:06 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:06 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e68001820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:07 np0005540826 python3.9[176307]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec  1 05:04:07 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:07 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e8c003f60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:07 np0005540826 systemd[1]: Reloading.
Dec  1 05:04:07 np0005540826 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 05:04:07 np0005540826 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 05:04:07 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:07 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e90003580 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:07 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:04:07 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:04:07 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:04:07.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:04:08 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:04:08 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:04:08 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:04:08.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:04:08 np0005540826 python3.9[176496]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec  1 05:04:08 np0005540826 systemd[1]: Reloading.
Dec  1 05:04:08 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis[85207]: [WARNING] 334/100408 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  1 05:04:08 np0005540826 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 05:04:08 np0005540826 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 05:04:08 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:08 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e98003cc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:09 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:09 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e68001820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:09 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:09 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e8c003f60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:09 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:04:09 np0005540826 python3.9[176686]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec  1 05:04:09 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:04:09 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:04:09.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:04:09 np0005540826 systemd[1]: Reloading.
Dec  1 05:04:09 np0005540826 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 05:04:09 np0005540826 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 05:04:10 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:04:10 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:04:10 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:04:10.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:04:10 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:10 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e90003ea0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:10 np0005540826 python3.9[176877]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec  1 05:04:10 np0005540826 systemd[1]: Reloading.
Dec  1 05:04:10 np0005540826 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 05:04:10 np0005540826 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 05:04:11 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:04:11 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:11 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e98003cc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:11 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:11 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e68002cb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:11 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:04:11 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:04:11 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:04:11.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:04:12 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:04:12 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:04:12 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:04:12.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:04:12 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:12 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e8c003f60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:12 np0005540826 python3.9[177068]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  1 05:04:12 np0005540826 systemd[1]: Reloading.
Dec  1 05:04:12 np0005540826 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 05:04:12 np0005540826 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 05:04:13 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:13 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e90003ea0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:13 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:13 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e980045e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:13 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:04:13 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:04:13 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:04:13.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:04:13 np0005540826 python3.9[177258]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  1 05:04:14 np0005540826 systemd[1]: Reloading.
Dec  1 05:04:14 np0005540826 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 05:04:14 np0005540826 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 05:04:14 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:04:14 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:04:14 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:04:14.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:04:14 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:14 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e68002cb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:15 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:15 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e90003ea0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:15 np0005540826 python3.9[177449]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  1 05:04:15 np0005540826 systemd[1]: Reloading.
Dec  1 05:04:15 np0005540826 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 05:04:15 np0005540826 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 05:04:15 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:15 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e90003ea0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:15 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:04:15 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:04:15 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:04:15.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:04:16 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:04:16 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:04:16 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:04:16 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:04:16.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:04:16 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:16 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e980045e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:16 np0005540826 python3.9[177640]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  1 05:04:17 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:17 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e68002cb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:17 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:17 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e8c003f60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:17 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:04:17 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:04:17 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:04:17.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:04:17 np0005540826 python3.9[177795]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  1 05:04:17 np0005540826 systemd[1]: Reloading.
Dec  1 05:04:17 np0005540826 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 05:04:17 np0005540826 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 05:04:18 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:18 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  1 05:04:18 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:04:18 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:04:18 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:04:18.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:04:18 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:18 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e90003ea0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:19 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:19 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e980045e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:19 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:19 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e68003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:19 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:04:19 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:04:19 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:04:19.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:04:20 np0005540826 python3.9[178011]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec  1 05:04:20 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:04:20 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:04:20 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:04:20.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:04:20 np0005540826 systemd[1]: Reloading.
Dec  1 05:04:20 np0005540826 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 05:04:20 np0005540826 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 05:04:20 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:20 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e8c003f60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:20 np0005540826 systemd[1]: Listening on libvirt proxy daemon socket.
Dec  1 05:04:20 np0005540826 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Dec  1 05:04:21 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:04:21 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:21 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  1 05:04:21 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:21 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 05:04:21 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:21 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e90003ea0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:21 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:21 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e980045e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:21 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:04:21 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:04:21 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:04:21.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:04:21 np0005540826 python3.9[178205]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  1 05:04:22 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:04:22 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:04:22 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:04:22.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:04:22 np0005540826 python3.9[178361]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  1 05:04:22 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:22 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e68003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:23 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:23 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e68003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:23 np0005540826 python3.9[178517]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  1 05:04:23 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:23 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e90003ea0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:23 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:04:23 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:04:23 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:04:23.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:04:24 np0005540826 python3.9[178672]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  1 05:04:24 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:24 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  1 05:04:24 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:04:24 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:04:24 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:04:24.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:04:24 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:24 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e980045e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:24 np0005540826 python3.9[178828]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  1 05:04:25 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:25 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e6c001230 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:25 np0005540826 podman[178955]: 2025-12-01 10:04:25.433449557 +0000 UTC m=+0.099868472 container health_status 6b06bbb7dde72e1297fe084ecc5611549b12415c8a2a7d771b9728c6924dbf55 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec  1 05:04:25 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:25 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e68003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:25 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:04:25 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:04:25 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:04:25.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:04:25 np0005540826 python3.9[179002]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  1 05:04:26 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:04:26 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:04:26 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:04:26 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:04:26.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:04:26 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:26 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e90003ea0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:26 np0005540826 python3.9[179165]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  1 05:04:27 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:27 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e980045e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:27 np0005540826 python3.9[179320]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  1 05:04:27 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:27 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e6c001230 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:27 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:04:27 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:04:27 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:04:27.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:04:28 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:04:28 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:04:28 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:04:28.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:04:28 np0005540826 python3.9[179475]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  1 05:04:28 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:28 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e68003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:29 np0005540826 python3.9[179632]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  1 05:04:29 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:29 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e90003ea0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:29 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:29 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e90003ea0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:29 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:04:29 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 05:04:29 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:04:29.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 05:04:29 np0005540826 python3.9[179787]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  1 05:04:30 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:04:30 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:04:30 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:04:30.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:04:30 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis[85207]: [WARNING] 334/100430 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 1ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  1 05:04:30 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:30 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e6c002110 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:30 np0005540826 python3.9[179943]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  1 05:04:31 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:04:31 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:31 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e68003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:31 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:31 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e74001230 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:31 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:04:31 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:04:31 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:04:31.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:04:31 np0005540826 python3.9[180098]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  1 05:04:32 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:04:32 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:04:32 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:04:32.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:04:32 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:32 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e90003ea0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:33 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:33 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e6c002110 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:33 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:33 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e68003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:33 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:04:33 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:04:33 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:04:33.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:04:33 np0005540826 python3.9[180254]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  1 05:04:34 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:04:34 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:04:34 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:04:34.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:04:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:34 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e74002620 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:34 np0005540826 python3.9[180410]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec  1 05:04:35 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:35 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e90003ea0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:35 np0005540826 python3.9[180562]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec  1 05:04:35 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:35 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e6c002110 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:35 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:04:35 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:04:35 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:04:35.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:04:35 np0005540826 python3.9[180714]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 05:04:36 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:04:36 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:04:36 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:04:36 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:04:36.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:04:36 np0005540826 podman[180839]: 2025-12-01 10:04:36.366222989 +0000 UTC m=+0.056287073 container health_status 2f6a34a2eb1139886d5df897b4264e2aa17883bd121a8757ac91426f6d44356f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Dec  1 05:04:36 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:36 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e68003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:36 np0005540826 python3.9[180887]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 05:04:37 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:37 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e74002620 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:37 np0005540826 python3.9[181039]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 05:04:37 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:37 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e90003ea0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:37 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:04:37 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:04:37 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:04:37.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:04:37 np0005540826 python3.9[181214]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec  1 05:04:38 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:04:38 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:04:38 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:04:38.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:04:38 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:38 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e6c003bd0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:38 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 05:04:38 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:04:38 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:04:38 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 05:04:38 np0005540826 python3.9[181426]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:04:39 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:39 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e68003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:39 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:39 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e74002620 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:39 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:04:39 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:04:39 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:04:39.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:04:39 np0005540826 python3.9[181576]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764583478.2972226-1623-204923077816404/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:04:40 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:04:40 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:04:40 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:04:40.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:04:40 np0005540826 python3.9[181729]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:04:40 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:40 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e90003ea0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:40 np0005540826 python3.9[181854]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764583479.8763697-1623-158045185938358/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:04:41 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:04:41 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:41 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e6c003bd0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:41 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:41 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e68003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:41 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:04:41 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:04:41 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:04:41.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:04:41 np0005540826 python3.9[182006]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:04:42 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:04:42 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:04:42 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:04:42.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:04:42 np0005540826 python3.9[182131]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764583481.1658566-1623-73801556832491/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:04:42 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:42 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e74002620 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:42 np0005540826 python3.9[182284]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:04:43 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:43 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e6c0044f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:43 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:43 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e90003ea0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:43 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:04:43 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:04:43 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:04:43.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:04:43 np0005540826 python3.9[182409]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764583482.3785782-1623-253646447088068/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:04:44 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:04:44 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:04:44 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:04:44.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:04:44 np0005540826 python3.9[182586]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:04:44 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:44 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e68003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:44 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:04:44 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:04:44 np0005540826 python3.9[182712]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764583483.7187934-1623-144988296050153/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:04:45 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:45 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e74002620 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:45 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:45 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e6c0044f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:45 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:04:45 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:04:45 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:04:45.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:04:45 np0005540826 python3.9[182864]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:04:46 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:04:46 np0005540826 python3.9[182989]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764583485.0947535-1623-155234951221309/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:04:46 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:04:46 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:04:46 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:04:46.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:04:46 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:46 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e90003ea0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:46 np0005540826 python3.9[183142]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:04:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:47 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e68003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:47 np0005540826 python3.9[183265]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764583486.3526733-1623-80506955528145/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:04:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:47 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e74002620 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:47 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:04:47 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:04:47 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:04:47.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:04:47 np0005540826 python3.9[183417]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:04:48 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:04:48 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:04:48 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:04:48.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:04:48 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:48 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e6c0044f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:48 np0005540826 python3.9[183543]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764583487.5263884-1623-265231394021405/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:04:49 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:49 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e90004040 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:49 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:49 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e68003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:49 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:04:49 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 05:04:49 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:04:49.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 05:04:49 np0005540826 python3.9[183695]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Dec  1 05:04:50 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:04:50 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 05:04:50 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:04:50.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 05:04:50 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:50 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e74002620 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:50 np0005540826 python3.9[183849]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:04:51 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:04:51 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:51 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e6c0044f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:51 np0005540826 python3.9[184001]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:04:51 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:51 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e90004060 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:51 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:04:51 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:04:51 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:04:51.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:04:51 np0005540826 python3.9[184153]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:04:52 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:04:52 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:04:52 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:04:52.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:04:52 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:52 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e68003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:52 np0005540826 python3.9[184306]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:04:53 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:53 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e74002620 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:53 np0005540826 python3.9[184459]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:04:53 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:53 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e6c0044f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:53 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:04:53 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:04:53 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:04:53.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:04:53 np0005540826 python3.9[184611]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:04:54 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:04:54 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:04:54 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:04:54.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:04:54 np0005540826 python3.9[184764]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:04:54 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:54 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e8c001e00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:55 np0005540826 python3.9[184916]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:04:55 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:55 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e68003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:55 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:55 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e74002620 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:55 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:04:55 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:04:55 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:04:55.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:04:56 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:04:56 np0005540826 podman[185042]: 2025-12-01 10:04:56.193631496 +0000 UTC m=+0.089041699 container health_status 6b06bbb7dde72e1297fe084ecc5611549b12415c8a2a7d771b9728c6924dbf55 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, config_id=ovn_controller, container_name=ovn_controller)
Dec  1 05:04:56 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:04:56 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:04:56 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:04:56.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:04:56 np0005540826 python3.9[185092]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:04:56 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:56 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e74002620 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:57 np0005540826 python3.9[185249]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:04:57 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:57 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e8c0011e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:57 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:57 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e6c0044f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:57 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:04:57 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:04:57 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:04:57.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:04:57 np0005540826 python3.9[185401]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:04:58 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:04:58 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:04:58 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:04:58.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:04:58 np0005540826 python3.9[185554]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:04:58 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:58 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e68003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:59 np0005540826 python3.9[185706]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:04:59 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:59 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e74002620 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:59 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:04:59 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e8c0011e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:59 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:04:59 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:04:59 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:04:59.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:04:59 np0005540826 ceph-mon[80026]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #28. Immutable memtables: 0.
Dec  1 05:04:59 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:04:59.785469) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  1 05:04:59 np0005540826 ceph-mon[80026]: rocksdb: [db/flush_job.cc:856] [default] [JOB 13] Flushing memtable with next log file: 28
Dec  1 05:04:59 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583499785537, "job": 13, "event": "flush_started", "num_memtables": 1, "num_entries": 4275, "num_deletes": 502, "total_data_size": 11678924, "memory_usage": 11864088, "flush_reason": "Manual Compaction"}
Dec  1 05:04:59 np0005540826 ceph-mon[80026]: rocksdb: [db/flush_job.cc:885] [default] [JOB 13] Level-0 flush table #29: started
Dec  1 05:04:59 np0005540826 python3.9[185882]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:04:59 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583499822549, "cf_name": "default", "job": 13, "event": "table_file_creation", "file_number": 29, "file_size": 4370129, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13235, "largest_seqno": 17505, "table_properties": {"data_size": 4358823, "index_size": 6328, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3909, "raw_key_size": 30747, "raw_average_key_size": 19, "raw_value_size": 4331874, "raw_average_value_size": 2805, "num_data_blocks": 275, "num_entries": 1544, "num_filter_entries": 1544, "num_deletions": 502, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764583099, "oldest_key_time": 1764583099, "file_creation_time": 1764583499, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d6e1f97e-eb58-41c1-b758-cb672eabd75e", "db_session_id": "KSHNHU57VR1LHZ6EZUM4", "orig_file_number": 29, "seqno_to_time_mapping": "N/A"}}
Dec  1 05:04:59 np0005540826 ceph-mon[80026]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 13] Flush lasted 37709 microseconds, and 11258 cpu microseconds.
Dec  1 05:04:59 np0005540826 ceph-mon[80026]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 05:04:59 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:04:59.823170) [db/flush_job.cc:967] [default] [JOB 13] Level-0 flush table #29: 4370129 bytes OK
Dec  1 05:04:59 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:04:59.823197) [db/memtable_list.cc:519] [default] Level-0 commit table #29 started
Dec  1 05:04:59 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:04:59.826030) [db/memtable_list.cc:722] [default] Level-0 commit table #29: memtable #1 done
Dec  1 05:04:59 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:04:59.826118) EVENT_LOG_v1 {"time_micros": 1764583499826112, "job": 13, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  1 05:04:59 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:04:59.826142) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  1 05:04:59 np0005540826 ceph-mon[80026]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 13] Try to delete WAL files size 11659909, prev total WAL file size 11659909, number of live WAL files 2.
Dec  1 05:04:59 np0005540826 ceph-mon[80026]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000025.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:04:59 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:04:59.829252) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D67727374617400323530' seq:72057594037927935, type:22 .. '6D67727374617400353032' seq:0, type:0; will stop at (end)
Dec  1 05:04:59 np0005540826 ceph-mon[80026]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 14] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  1 05:04:59 np0005540826 ceph-mon[80026]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 13 Base level 0, inputs: [29(4267KB)], [27(13MB)]
Dec  1 05:04:59 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583499829293, "job": 14, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [29], "files_L6": [27], "score": -1, "input_data_size": 18164287, "oldest_snapshot_seqno": -1}
Dec  1 05:04:59 np0005540826 ceph-mon[80026]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 14] Generated table #30: 5008 keys, 13509363 bytes, temperature: kUnknown
Dec  1 05:04:59 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583499905797, "cf_name": "default", "job": 14, "event": "table_file_creation", "file_number": 30, "file_size": 13509363, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13474390, "index_size": 21359, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12549, "raw_key_size": 125432, "raw_average_key_size": 25, "raw_value_size": 13381917, "raw_average_value_size": 2672, "num_data_blocks": 892, "num_entries": 5008, "num_filter_entries": 5008, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582559, "oldest_key_time": 0, "file_creation_time": 1764583499, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d6e1f97e-eb58-41c1-b758-cb672eabd75e", "db_session_id": "KSHNHU57VR1LHZ6EZUM4", "orig_file_number": 30, "seqno_to_time_mapping": "N/A"}}
Dec  1 05:04:59 np0005540826 ceph-mon[80026]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 05:04:59 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:04:59.906047) [db/compaction/compaction_job.cc:1663] [default] [JOB 14] Compacted 1@0 + 1@6 files to L6 => 13509363 bytes
Dec  1 05:04:59 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:04:59.907642) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 237.2 rd, 176.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(4.2, 13.2 +0.0 blob) out(12.9 +0.0 blob), read-write-amplify(7.2) write-amplify(3.1) OK, records in: 5837, records dropped: 829 output_compression: NoCompression
Dec  1 05:04:59 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:04:59.907663) EVENT_LOG_v1 {"time_micros": 1764583499907650, "job": 14, "event": "compaction_finished", "compaction_time_micros": 76582, "compaction_time_cpu_micros": 32241, "output_level": 6, "num_output_files": 1, "total_output_size": 13509363, "num_input_records": 5837, "num_output_records": 5008, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  1 05:04:59 np0005540826 ceph-mon[80026]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000029.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:04:59 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583499908744, "job": 14, "event": "table_file_deletion", "file_number": 29}
Dec  1 05:04:59 np0005540826 ceph-mon[80026]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000027.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:04:59 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583499911659, "job": 14, "event": "table_file_deletion", "file_number": 27}
Dec  1 05:04:59 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:04:59.829199) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:04:59 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:04:59.911714) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:04:59 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:04:59.911719) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:04:59 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:04:59.911721) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:04:59 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:04:59.911724) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:04:59 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:04:59.911725) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:05:00 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:05:00 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:05:00 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:05:00.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:05:00 np0005540826 python3.9[186036]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:05:00 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:00 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e6c0044f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:01 np0005540826 python3.9[186159]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583500.0165608-2286-67537401071490/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:05:01 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:05:01 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:01 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e74002620 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:01 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:01 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e68003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:01 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:05:01 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:05:01 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:05:01.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:05:01 np0005540826 python3.9[186311]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:05:02 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:05:02 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:05:02 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:05:02.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:05:02 np0005540826 python3.9[186435]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583501.2302053-2286-260413162251510/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:05:02 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:02 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e8c001380 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:02 np0005540826 python3.9[186587]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:05:03 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:03 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e6c0044f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:03 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:03 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e74002620 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:03 np0005540826 python3.9[186710]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583502.4884088-2286-65190494420110/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:05:03 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:05:03 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:05:03 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:05:03.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:05:04 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:05:04 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:05:04 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:05:04.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:05:04 np0005540826 python3.9[186862]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:05:04 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:04 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e68003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:05:04.536 141685 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:05:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:05:04.538 141685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:05:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:05:04.538 141685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:05:04 np0005540826 python3.9[186986]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583503.7070644-2286-166710155976422/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:05:05 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:05 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e8c001520 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:05 np0005540826 python3.9[187138]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:05:05 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:05 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e6c0044f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:05 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:05:05 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:05:05 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:05:05.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:05:05 np0005540826 python3.9[187261]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583504.9646418-2286-43478262788809/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:05:06 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:05:06 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:05:06 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:05:06 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:05:06.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:05:06 np0005540826 podman[187386]: 2025-12-01 10:05:06.455040922 +0000 UTC m=+0.056260322 container health_status 2f6a34a2eb1139886d5df897b4264e2aa17883bd121a8757ac91426f6d44356f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec  1 05:05:06 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:06 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e74002620 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:06 np0005540826 python3.9[187433]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:05:07 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:07 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e68003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:07 np0005540826 python3.9[187556]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583506.1419778-2286-242613154909278/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:05:07 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:07 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e8c003910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:07 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:05:07 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:05:07 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:05:07.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:05:07 np0005540826 python3.9[187708]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:05:08 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:05:08 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.002000052s ======
Dec  1 05:05:08 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:05:08.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000052s
Dec  1 05:05:08 np0005540826 python3.9[187832]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583507.3393319-2286-138191844292480/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:05:08 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:08 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e6c0044f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:08 np0005540826 python3.9[187984]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:05:09 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:09 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e74002620 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:09 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:09 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e68003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:09 np0005540826 python3.9[188107]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583508.4935875-2286-265057291081894/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:05:09 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:05:09 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:05:09 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:05:09.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:05:10 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:05:10 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:05:10 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:05:10.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:05:10 np0005540826 python3.9[188259]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:05:10 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:10 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e8c003910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:10 np0005540826 python3.9[188383]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583509.7119007-2286-107162509019590/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:05:11 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:05:11 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis[85207]: [WARNING] 334/100511 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  1 05:05:11 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:11 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e6c0044f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:11 np0005540826 python3.9[188535]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:05:11 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:11 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e74002620 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:11 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:05:11 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:05:11 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:05:11.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:05:11 np0005540826 python3.9[188658]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583510.9481971-2286-24015933288314/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:05:12 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:05:12 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:05:12 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:05:12.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:05:12 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:12 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e68003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:12 np0005540826 python3.9[188811]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:05:13 np0005540826 python3.9[188934]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583512.095891-2286-201334388615437/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:05:13 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:13 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e8c004620 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:13 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:13 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e6c0044f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:13 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:05:13 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:05:13 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:05:13.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:05:14 np0005540826 python3.9[189086]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:05:14 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:05:14 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:05:14 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:05:14.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:05:14 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis[85207]: [WARNING] 334/100514 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  1 05:05:14 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:14 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e74002620 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:14 np0005540826 python3.9[189210]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583513.251873-2286-13638669221554/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:05:15 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:15 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e68003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:15 np0005540826 python3.9[189362]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:05:15 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:15 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e8c004620 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:15 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:05:15 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:05:15 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:05:15.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:05:15 np0005540826 python3.9[189485]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583514.814931-2286-27892337933334/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:05:16 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:05:16 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:05:16 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:05:16 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:05:16.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:05:16 np0005540826 python3.9[189638]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:05:16 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:16 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e6c0044f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:17 np0005540826 python3.9[189761]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583516.0283127-2286-65340472453758/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:05:17 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:17 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e74002620 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:17 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:17 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e68003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:17 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:05:17 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:05:17 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:05:17.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:05:18 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:05:18 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:05:18 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:05:18.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:05:18 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:18 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e8c004620 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:18 np0005540826 python3.9[189912]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ls -lRZ /run/libvirt | grep -E ':container_\S+_t'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 05:05:19 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:19 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e8c004620 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:19 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:19 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e8c004620 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:19 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:19 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  1 05:05:19 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:05:19 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:05:19 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:05:19.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:05:19 np0005540826 python3.9[190090]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Dec  1 05:05:20 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:05:20 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:05:20 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:05:20.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:05:20 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:20 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e68003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:21 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:05:21 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:21 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e74002620 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:21 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:21 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e6c0044f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:21 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:05:21 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:05:21 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:05:21.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:05:21 np0005540826 dbus-broker-launch[776]: avc:  op=load_policy lsm=selinux seqno=15 res=1
Dec  1 05:05:22 np0005540826 python3.9[190249]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:05:22 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:05:22 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:05:22 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:05:22.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:05:22 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:22 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  1 05:05:22 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:22 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 05:05:22 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:22 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e8c004620 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:22 np0005540826 python3.9[190402]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:05:22 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:22 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 05:05:23 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:23 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e68003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:23 np0005540826 python3.9[190554]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:05:23 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:23 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e74002620 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:23 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:05:23 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:05:23 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:05:23.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:05:23 np0005540826 python3.9[190706]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:05:24 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:05:24 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:05:24 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:05:24.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:05:24 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:24 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e6c0044f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:24 np0005540826 python3.9[190859]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:05:25 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:25 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e8c004620 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:25 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:25 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e68003dd0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:25 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:25 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 05:05:25 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:05:25 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:05:25 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:05:25.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:05:25 np0005540826 python3.9[191011]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:05:26 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:05:26 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:05:26 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:05:26 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:05:26.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:05:26 np0005540826 podman[191164]: 2025-12-01 10:05:26.352895223 +0000 UTC m=+0.092152551 container health_status 6b06bbb7dde72e1297fe084ecc5611549b12415c8a2a7d771b9728c6924dbf55 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, config_id=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller)
Dec  1 05:05:26 np0005540826 python3.9[191165]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:05:26 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:26 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e74002620 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:27 np0005540826 python3.9[191345]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:05:27 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:27 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e6c0044f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:27 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:27 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e780014d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:27 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:05:27 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:05:27 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:05:27.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:05:27 np0005540826 python3.9[191497]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:05:28 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:05:28 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:05:28 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:05:28.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:05:28 np0005540826 python3.9[191649]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:05:28 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:28 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  1 05:05:28 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:28 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e68003dd0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:29 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:29 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e90001340 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:29 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:29 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e6c0044f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:29 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:05:29 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:05:29 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:05:29.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:05:29 np0005540826 python3.9[191802]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  1 05:05:29 np0005540826 systemd[1]: Reloading.
Dec  1 05:05:29 np0005540826 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 05:05:29 np0005540826 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 05:05:30 np0005540826 systemd[1]: Starting libvirt logging daemon socket...
Dec  1 05:05:30 np0005540826 systemd[1]: Listening on libvirt logging daemon socket.
Dec  1 05:05:30 np0005540826 systemd[1]: Starting libvirt logging daemon admin socket...
Dec  1 05:05:30 np0005540826 systemd[1]: Listening on libvirt logging daemon admin socket.
Dec  1 05:05:30 np0005540826 systemd[1]: Starting libvirt logging daemon...
Dec  1 05:05:30 np0005540826 systemd[1]: Started libvirt logging daemon.
Dec  1 05:05:30 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:05:30 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:05:30 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:05:30.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:05:30 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:30 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e780014d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:30 np0005540826 python3.9[191997]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  1 05:05:30 np0005540826 systemd[1]: Reloading.
Dec  1 05:05:31 np0005540826 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 05:05:31 np0005540826 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 05:05:31 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:05:31 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis[85207]: [WARNING] 334/100531 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  1 05:05:31 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:31 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e68003df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:31 np0005540826 systemd[1]: Starting libvirt nodedev daemon socket...
Dec  1 05:05:31 np0005540826 systemd[1]: Listening on libvirt nodedev daemon socket.
Dec  1 05:05:31 np0005540826 systemd[1]: Starting libvirt nodedev daemon admin socket...
Dec  1 05:05:31 np0005540826 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Dec  1 05:05:31 np0005540826 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Dec  1 05:05:31 np0005540826 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Dec  1 05:05:31 np0005540826 systemd[1]: Starting libvirt nodedev daemon...
Dec  1 05:05:31 np0005540826 systemd[1]: Started libvirt nodedev daemon.
Dec  1 05:05:31 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:31 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e90001340 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:31 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:05:31 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:05:31 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:05:31.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:05:32 np0005540826 python3.9[192213]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  1 05:05:32 np0005540826 systemd[1]: Reloading.
Dec  1 05:05:32 np0005540826 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 05:05:32 np0005540826 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 05:05:32 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:05:32 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:05:32 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:05:32.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:05:32 np0005540826 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Dec  1 05:05:32 np0005540826 systemd[1]: Starting libvirt proxy daemon admin socket...
Dec  1 05:05:32 np0005540826 systemd[1]: Starting libvirt proxy daemon read-only socket...
Dec  1 05:05:32 np0005540826 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Dec  1 05:05:32 np0005540826 systemd[1]: Listening on libvirt proxy daemon admin socket.
Dec  1 05:05:32 np0005540826 systemd[1]: Starting libvirt proxy daemon...
Dec  1 05:05:32 np0005540826 systemd[1]: Started libvirt proxy daemon.
Dec  1 05:05:32 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:32 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e6c0044f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:32 np0005540826 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Dec  1 05:05:32 np0005540826 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Dec  1 05:05:32 np0005540826 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Dec  1 05:05:33 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:33 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e780014d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:33 np0005540826 python3.9[192433]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  1 05:05:33 np0005540826 systemd[1]: Reloading.
Dec  1 05:05:33 np0005540826 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 05:05:33 np0005540826 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 05:05:33 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:33 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e68003e10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:33 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:05:33 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:05:33 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:05:33.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:05:33 np0005540826 systemd[1]: Listening on libvirt locking daemon socket.
Dec  1 05:05:33 np0005540826 systemd[1]: Starting libvirt QEMU daemon socket...
Dec  1 05:05:33 np0005540826 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Dec  1 05:05:33 np0005540826 systemd[1]: Starting Virtual Machine and Container Registration Service...
Dec  1 05:05:33 np0005540826 systemd[1]: Listening on libvirt QEMU daemon socket.
Dec  1 05:05:33 np0005540826 systemd[1]: Starting libvirt QEMU daemon admin socket...
Dec  1 05:05:33 np0005540826 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Dec  1 05:05:33 np0005540826 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Dec  1 05:05:33 np0005540826 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Dec  1 05:05:33 np0005540826 systemd[1]: Started Virtual Machine and Container Registration Service.
Dec  1 05:05:33 np0005540826 systemd[1]: Starting libvirt QEMU daemon...
Dec  1 05:05:33 np0005540826 systemd[1]: Started libvirt QEMU daemon.
Dec  1 05:05:33 np0005540826 setroubleshoot[192251]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 65a2a17e-9938-4163-918d-3c4c68e6732e
Dec  1 05:05:33 np0005540826 setroubleshoot[192251]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012*****  Plugin dac_override (91.4 confidence) suggests   **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012*****  Plugin catchall (9.59 confidence) suggests   **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012
Dec  1 05:05:33 np0005540826 setroubleshoot[192251]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 65a2a17e-9938-4163-918d-3c4c68e6732e
Dec  1 05:05:33 np0005540826 setroubleshoot[192251]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012*****  Plugin dac_override (91.4 confidence) suggests   **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012*****  Plugin catchall (9.59 confidence) suggests   **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012
Dec  1 05:05:34 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:05:34 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:05:34 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:05:34.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:05:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis[85207]: [WARNING] 334/100534 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  1 05:05:34 np0005540826 python3.9[192651]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  1 05:05:34 np0005540826 systemd[1]: Reloading.
Dec  1 05:05:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:34 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e90001340 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:34 np0005540826 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 05:05:34 np0005540826 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 05:05:34 np0005540826 systemd[1]: Starting libvirt secret daemon socket...
Dec  1 05:05:34 np0005540826 systemd[1]: Listening on libvirt secret daemon socket.
Dec  1 05:05:34 np0005540826 systemd[1]: Starting libvirt secret daemon admin socket...
Dec  1 05:05:34 np0005540826 systemd[1]: Starting libvirt secret daemon read-only socket...
Dec  1 05:05:34 np0005540826 systemd[1]: Listening on libvirt secret daemon admin socket.
Dec  1 05:05:34 np0005540826 systemd[1]: Listening on libvirt secret daemon read-only socket.
Dec  1 05:05:34 np0005540826 systemd[1]: Starting libvirt secret daemon...
Dec  1 05:05:34 np0005540826 systemd[1]: Started libvirt secret daemon.
Dec  1 05:05:35 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:35 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e6c0044f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:35 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:35 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e780014d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:35 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:05:35 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:05:35 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:05:35.597 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:05:36 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:05:36 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:05:36 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:05:36 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:05:36.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:05:36 np0005540826 python3.9[192864]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:05:36 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:36 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e90001340 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:36 np0005540826 podman[192965]: 2025-12-01 10:05:36.985684961 +0000 UTC m=+0.063630617 container health_status 2f6a34a2eb1139886d5df897b4264e2aa17883bd121a8757ac91426f6d44356f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec  1 05:05:37 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:37 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e68003fd0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:37 np0005540826 python3.9[193035]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec  1 05:05:37 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:37 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e68003fd0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:37 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:05:37 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:05:37 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:05:37.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:05:38 np0005540826 python3.9[193187]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail;#012echo ceph#012awk -F '=' '/fsid/ {print $2}' /var/lib/openstack/config/ceph/ceph.conf | xargs#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 05:05:38 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:05:38 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:05:38 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:05:38.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:05:38 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:38 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e78002e70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:38 np0005540826 python3.9[193342]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.keyring'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec  1 05:05:39 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:39 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e90001340 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:39 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:39 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e90001340 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:39 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:05:39 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:05:39 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:05:39.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:05:39 np0005540826 python3.9[193492]: ansible-ansible.legacy.stat Invoked with path=/tmp/secret.xml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:05:40 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:05:40 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:05:40 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:05:40.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:05:40 np0005540826 python3.9[193639]: ansible-ansible.legacy.copy Invoked with dest=/tmp/secret.xml mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764583539.3422425-3360-49765141861895/.source.xml follow=False _original_basename=secret.xml.j2 checksum=b828192784cecb28a4416a509fc39e7cc46c1495 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:05:40 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:40 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e6c0044f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:41 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:05:41 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:41 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e68003fd0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:41 np0005540826 python3.9[193791]: ansible-ansible.legacy.command Invoked with _raw_params=virsh secret-undefine 365f19c2-81e5-5edd-b6b4-280555214d3a#012virsh secret-define --file /tmp/secret.xml#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 05:05:41 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:41 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e78002e70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:41 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:05:41 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:05:41 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:05:41.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:05:42 np0005540826 python3.9[193953]: ansible-ansible.builtin.file Invoked with path=/tmp/secret.xml state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:05:42 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:05:42 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:05:42 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:05:42.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:05:42 np0005540826 ceph-mon[80026]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #31. Immutable memtables: 0.
Dec  1 05:05:42 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:05:42.507461) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  1 05:05:42 np0005540826 ceph-mon[80026]: rocksdb: [db/flush_job.cc:856] [default] [JOB 15] Flushing memtable with next log file: 31
Dec  1 05:05:42 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583542507501, "job": 15, "event": "flush_started", "num_memtables": 1, "num_entries": 623, "num_deletes": 251, "total_data_size": 1178631, "memory_usage": 1200848, "flush_reason": "Manual Compaction"}
Dec  1 05:05:42 np0005540826 ceph-mon[80026]: rocksdb: [db/flush_job.cc:885] [default] [JOB 15] Level-0 flush table #32: started
Dec  1 05:05:42 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583542514866, "cf_name": "default", "job": 15, "event": "table_file_creation", "file_number": 32, "file_size": 772229, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 17511, "largest_seqno": 18128, "table_properties": {"data_size": 769107, "index_size": 1094, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 965, "raw_key_size": 7040, "raw_average_key_size": 18, "raw_value_size": 762989, "raw_average_value_size": 2040, "num_data_blocks": 49, "num_entries": 374, "num_filter_entries": 374, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764583500, "oldest_key_time": 1764583500, "file_creation_time": 1764583542, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d6e1f97e-eb58-41c1-b758-cb672eabd75e", "db_session_id": "KSHNHU57VR1LHZ6EZUM4", "orig_file_number": 32, "seqno_to_time_mapping": "N/A"}}
Dec  1 05:05:42 np0005540826 ceph-mon[80026]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 15] Flush lasted 7470 microseconds, and 3591 cpu microseconds.
Dec  1 05:05:42 np0005540826 ceph-mon[80026]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 05:05:42 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:05:42.514927) [db/flush_job.cc:967] [default] [JOB 15] Level-0 flush table #32: 772229 bytes OK
Dec  1 05:05:42 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:05:42.514953) [db/memtable_list.cc:519] [default] Level-0 commit table #32 started
Dec  1 05:05:42 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:05:42.516543) [db/memtable_list.cc:722] [default] Level-0 commit table #32: memtable #1 done
Dec  1 05:05:42 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:05:42.516616) EVENT_LOG_v1 {"time_micros": 1764583542516604, "job": 15, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  1 05:05:42 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:05:42.516648) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  1 05:05:42 np0005540826 ceph-mon[80026]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 15] Try to delete WAL files size 1175183, prev total WAL file size 1175183, number of live WAL files 2.
Dec  1 05:05:42 np0005540826 ceph-mon[80026]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000028.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:05:42 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:05:42.517485) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031303034' seq:72057594037927935, type:22 .. '7061786F730031323536' seq:0, type:0; will stop at (end)
Dec  1 05:05:42 np0005540826 ceph-mon[80026]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 16] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  1 05:05:42 np0005540826 ceph-mon[80026]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 15 Base level 0, inputs: [32(754KB)], [30(12MB)]
Dec  1 05:05:42 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583542517518, "job": 16, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [32], "files_L6": [30], "score": -1, "input_data_size": 14281592, "oldest_snapshot_seqno": -1}
Dec  1 05:05:42 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:42 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e90001340 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:42 np0005540826 ceph-mon[80026]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 16] Generated table #33: 4872 keys, 12091737 bytes, temperature: kUnknown
Dec  1 05:05:42 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583542624840, "cf_name": "default", "job": 16, "event": "table_file_creation", "file_number": 33, "file_size": 12091737, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12058735, "index_size": 19708, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12229, "raw_key_size": 123218, "raw_average_key_size": 25, "raw_value_size": 11969671, "raw_average_value_size": 2456, "num_data_blocks": 819, "num_entries": 4872, "num_filter_entries": 4872, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582559, "oldest_key_time": 0, "file_creation_time": 1764583542, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d6e1f97e-eb58-41c1-b758-cb672eabd75e", "db_session_id": "KSHNHU57VR1LHZ6EZUM4", "orig_file_number": 33, "seqno_to_time_mapping": "N/A"}}
Dec  1 05:05:42 np0005540826 ceph-mon[80026]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 05:05:42 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:05:42.625661) [db/compaction/compaction_job.cc:1663] [default] [JOB 16] Compacted 1@0 + 1@6 files to L6 => 12091737 bytes
Dec  1 05:05:42 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:05:42.643147) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 133.0 rd, 112.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.7, 12.9 +0.0 blob) out(11.5 +0.0 blob), read-write-amplify(34.2) write-amplify(15.7) OK, records in: 5382, records dropped: 510 output_compression: NoCompression
Dec  1 05:05:42 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:05:42.643745) EVENT_LOG_v1 {"time_micros": 1764583542643728, "job": 16, "event": "compaction_finished", "compaction_time_micros": 107419, "compaction_time_cpu_micros": 26784, "output_level": 6, "num_output_files": 1, "total_output_size": 12091737, "num_input_records": 5382, "num_output_records": 4872, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  1 05:05:42 np0005540826 ceph-mon[80026]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000032.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:05:42 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583542644201, "job": 16, "event": "table_file_deletion", "file_number": 32}
Dec  1 05:05:42 np0005540826 ceph-mon[80026]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000030.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:05:42 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583542647472, "job": 16, "event": "table_file_deletion", "file_number": 30}
Dec  1 05:05:42 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:05:42.517393) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:05:42 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:05:42.647603) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:05:42 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:05:42.647609) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:05:42 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:05:42.647611) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:05:42 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:05:42.647613) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:05:42 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:05:42.647615) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:05:43 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:43 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e6c0044f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:43 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:43 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e68004170 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:43 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:05:43 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:05:43 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:05:43.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:05:43 np0005540826 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Dec  1 05:05:43 np0005540826 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Consumed 1.001s CPU time.
Dec  1 05:05:44 np0005540826 systemd[1]: setroubleshootd.service: Deactivated successfully.
Dec  1 05:05:44 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:05:44 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:05:44 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:05:44.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:05:44 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis[85207]: [WARNING] 334/100544 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 1ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  1 05:05:44 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:44 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e78003b80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:44 np0005540826 python3.9[194497]: ansible-ansible.legacy.copy Invoked with dest=/etc/ceph/ceph.conf group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/config/ceph/ceph.conf backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:05:45 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis[85207]: [WARNING] 334/100545 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  1 05:05:45 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:45 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e90004450 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:45 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:45 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e90004450 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:45 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:05:45 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:05:45 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:05:45.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:05:45 np0005540826 python3.9[194650]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:05:46 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:05:46 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:05:46 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:05:46 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:05:46.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:05:46 np0005540826 python3.9[194774]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1764583545.2184098-3525-114228304402753/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:05:46 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:46 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e68004170 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:47 np0005540826 python3.9[194926]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:05:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:47 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e78003b80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:47 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e90004450 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:47 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:05:47 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:05:47 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:05:47.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:05:47 np0005540826 python3.9[195078]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:05:48 np0005540826 auditd[706]: Audit daemon rotating log files
Dec  1 05:05:48 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:05:48 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:05:48 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:05:48.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:05:48 np0005540826 python3.9[195157]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:05:48 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:48 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e6c0044f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:49 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:05:49 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:05:49 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 05:05:49 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:05:49 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:05:49 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 05:05:49 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:49 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e68004170 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:49 np0005540826 python3.9[195309]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:05:49 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:49 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e78003b80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:49 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:05:49 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:05:49 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:05:49.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:05:49 np0005540826 python3.9[195387]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.thxdp28p recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:05:50 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:05:50 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:05:50 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:05:50.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:05:50 np0005540826 python3.9[195540]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:05:50 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:50 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e90004450 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:50 np0005540826 python3.9[195618]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:05:51 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:05:51 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:51 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e6c0044f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:51 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:51 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e68004170 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:51 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:05:51 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:05:51 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:05:51.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:05:51 np0005540826 python3.9[195770]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 05:05:52 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:05:52 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:05:52 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:05:52.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:05:52 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:52 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e78003b80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:52 np0005540826 python3[195924]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec  1 05:05:53 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:53 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e90004450 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:53 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:53 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  1 05:05:53 np0005540826 python3.9[196076]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:05:53 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:53 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e6c0044f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:53 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:05:53 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:05:53 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:05:53.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:05:53 np0005540826 python3.9[196154]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:05:54 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:05:54 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:05:54 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:05:54 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:05:54 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:05:54.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:05:54 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:54 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e68004170 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:54 np0005540826 python3.9[196332]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:05:55 np0005540826 python3.9[196410]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:05:55 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:55 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e78003b80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:55 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:55 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e90004450 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:55 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:05:55 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:05:55 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:05:55.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:05:56 np0005540826 python3.9[196562]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:05:56 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:05:56 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:05:56 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:05:56 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:05:56.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:05:56 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:56 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  1 05:05:56 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:56 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 05:05:56 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:56 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 05:05:56 np0005540826 python3.9[196641]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:05:57 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:57 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e6c0044f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:57 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:57 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e68004170 fd 50 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:57 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:57 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 05:05:57 np0005540826 podman[196649]: 2025-12-01 10:05:57.367131589 +0000 UTC m=+0.098139784 container health_status 6b06bbb7dde72e1297fe084ecc5611549b12415c8a2a7d771b9728c6924dbf55 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller)
Dec  1 05:05:57 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:57 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e78003b80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:57 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:05:57 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:05:57 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:05:57.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:05:57 np0005540826 python3.9[196821]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:05:58 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:05:58 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:05:58 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:05:58.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:05:58 np0005540826 python3.9[196900]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:05:58 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:58 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e74002230 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:59 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:59 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e6c0044f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:59 np0005540826 python3.9[197052]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:05:59 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:05:59 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e8c000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:59 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:05:59 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:05:59 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:05:59.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:05:59 np0005540826 python3.9[197177]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583558.659408-3900-194050368624554/.source.nft follow=False _original_basename=ruleset.j2 checksum=ac3ce8ce2d33fa5fe0a79b0c811c97734ce43fa5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:06:00 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:06:00 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  1 05:06:00 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:06:00 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:06:00 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:06:00.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:06:00 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:06:00 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e78003b80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:00 np0005540826 python3.9[197355]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:06:01 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:06:01 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:06:01 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e74002230 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:01 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:06:01 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e6c0044f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:01 np0005540826 python3.9[197507]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 05:06:01 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:06:01 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:06:01 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:06:01.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:06:02 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:06:02 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:06:02 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:06:02.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:06:02 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:06:02 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e8c000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:02 np0005540826 python3.9[197663]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:06:03 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:06:03 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e78003b80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:03 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[160655]: 01/12/2025 10:06:03 : epoch 692d67d9 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8e74002230 fd 39 proxy ignored for local
Dec  1 05:06:03 np0005540826 kernel: ganesha.nfsd[196650]: segfault at 50 ip 00007f8f4912132e sp 00007f8f0effc210 error 4 in libntirpc.so.5.8[7f8f49106000+2c000] likely on CPU 4 (core 0, socket 4)
Dec  1 05:06:03 np0005540826 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Dec  1 05:06:03 np0005540826 systemd[1]: Started Process Core Dump (PID 197816/UID 0).
Dec  1 05:06:03 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:06:03 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:06:03 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:06:03.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:06:03 np0005540826 python3.9[197815]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 05:06:04 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:06:04 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:06:04 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:06:04.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:06:04 np0005540826 python3.9[197971]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 05:06:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:06:04.537 141685 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:06:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:06:04.538 141685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:06:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:06:04.538 141685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:06:05 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis[85207]: [WARNING] 334/100605 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 1ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  1 05:06:05 np0005540826 systemd-coredump[197817]: Process 160684 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 66:#012#0  0x00007f8f4912132e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Dec  1 05:06:05 np0005540826 python3.9[198125]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 05:06:05 np0005540826 systemd[1]: systemd-coredump@6-197816-0.service: Deactivated successfully.
Dec  1 05:06:05 np0005540826 systemd[1]: systemd-coredump@6-197816-0.service: Consumed 1.911s CPU time.
Dec  1 05:06:05 np0005540826 podman[198133]: 2025-12-01 10:06:05.585136403 +0000 UTC m=+0.032335642 container died 16cf7695b9f1b772a53f155c9ea432f313af7c84182dd3ee2dd859b9e0afc79d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True)
Dec  1 05:06:05 np0005540826 systemd[1]: var-lib-containers-storage-overlay-d4290427fc7952b359d9766abfabad9cb592511b7a0a73e255bf414248b1b3e5-merged.mount: Deactivated successfully.
Dec  1 05:06:05 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:06:05 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:06:05 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:06:05.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:06:05 np0005540826 podman[198133]: 2025-12-01 10:06:05.649716973 +0000 UTC m=+0.096916182 container remove 16cf7695b9f1b772a53f155c9ea432f313af7c84182dd3ee2dd859b9e0afc79d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 05:06:05 np0005540826 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.0.0.compute-1.osfnzc.service: Main process exited, code=exited, status=139/n/a
Dec  1 05:06:05 np0005540826 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.0.0.compute-1.osfnzc.service: Failed with result 'exit-code'.
Dec  1 05:06:05 np0005540826 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.0.0.compute-1.osfnzc.service: Consumed 2.016s CPU time.
Dec  1 05:06:06 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:06:06 np0005540826 python3.9[198326]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:06:06 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:06:06 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:06:06 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:06:06.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:06:06 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis[85207]: [WARNING] 334/100606 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  1 05:06:07 np0005540826 python3.9[198478]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:06:07 np0005540826 podman[198573]: 2025-12-01 10:06:07.4984115 +0000 UTC m=+0.071835027 container health_status 2f6a34a2eb1139886d5df897b4264e2aa17883bd121a8757ac91426f6d44356f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec  1 05:06:07 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:06:07 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:06:07 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:06:07.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:06:07 np0005540826 python3.9[198616]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764583566.5411394-4116-180294452077178/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:06:08 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:06:08 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:06:08 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:06:08.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:06:08 np0005540826 python3.9[198773]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:06:08 np0005540826 python3.9[198896]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764583567.908628-4161-193296103573429/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:06:09 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:06:09 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:06:09 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:06:09.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:06:10 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:06:10 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:06:10 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:06:10.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:06:10 np0005540826 python3.9[199049]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:06:11 np0005540826 python3.9[199172]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764583569.9978874-4206-211726762671207/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:06:11 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:06:11 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis[85207]: [WARNING] 334/100611 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  1 05:06:11 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:06:11 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:06:11 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:06:11.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:06:11 np0005540826 python3.9[199324]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 05:06:11 np0005540826 systemd[1]: Reloading.
Dec  1 05:06:12 np0005540826 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 05:06:12 np0005540826 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 05:06:12 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:06:12 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:06:12 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:06:12.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:06:12 np0005540826 systemd[1]: Reached target edpm_libvirt.target.
Dec  1 05:06:13 np0005540826 python3.9[199516]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Dec  1 05:06:13 np0005540826 systemd[1]: Reloading.
Dec  1 05:06:13 np0005540826 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 05:06:13 np0005540826 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 05:06:13 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:06:13 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:06:13 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:06:13.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:06:13 np0005540826 systemd[1]: Reloading.
Dec  1 05:06:13 np0005540826 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 05:06:13 np0005540826 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 05:06:14 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:06:14 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:06:14 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:06:14.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:06:14 np0005540826 systemd-logind[787]: Session 52 logged out. Waiting for processes to exit.
Dec  1 05:06:14 np0005540826 systemd[1]: session-52.scope: Deactivated successfully.
Dec  1 05:06:14 np0005540826 systemd[1]: session-52.scope: Consumed 3min 39.466s CPU time.
Dec  1 05:06:14 np0005540826 systemd-logind[787]: Removed session 52.
Dec  1 05:06:15 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:06:15 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:06:15 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:06:15.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:06:15 np0005540826 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.0.0.compute-1.osfnzc.service: Scheduled restart job, restart counter is at 7.
Dec  1 05:06:15 np0005540826 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.osfnzc for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 05:06:15 np0005540826 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.0.0.compute-1.osfnzc.service: Consumed 2.016s CPU time.
Dec  1 05:06:15 np0005540826 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.osfnzc for 365f19c2-81e5-5edd-b6b4-280555214d3a...
Dec  1 05:06:16 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:06:16 np0005540826 podman[199662]: 2025-12-01 10:06:16.148415419 +0000 UTC m=+0.050689925 container create cc2e7494767935faa7d23c9c04148fcd19bfe6c3ed9edb784e0d0623d2a355bd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, ceph=True, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS)
Dec  1 05:06:16 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d96b48475c5fcde35c29db491a525c229588f1523d0b50c13a3f6bbefc81b98d/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Dec  1 05:06:16 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d96b48475c5fcde35c29db491a525c229588f1523d0b50c13a3f6bbefc81b98d/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 05:06:16 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d96b48475c5fcde35c29db491a525c229588f1523d0b50c13a3f6bbefc81b98d/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 05:06:16 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d96b48475c5fcde35c29db491a525c229588f1523d0b50c13a3f6bbefc81b98d/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.osfnzc-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 05:06:16 np0005540826 podman[199662]: 2025-12-01 10:06:16.216669523 +0000 UTC m=+0.118944019 container init cc2e7494767935faa7d23c9c04148fcd19bfe6c3ed9edb784e0d0623d2a355bd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True)
Dec  1 05:06:16 np0005540826 podman[199662]: 2025-12-01 10:06:16.124279548 +0000 UTC m=+0.026554054 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 05:06:16 np0005540826 podman[199662]: 2025-12-01 10:06:16.223334535 +0000 UTC m=+0.125609011 container start cc2e7494767935faa7d23c9c04148fcd19bfe6c3ed9edb784e0d0623d2a355bd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc, org.label-schema.build-date=20250325, CEPH_REF=squid, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 05:06:16 np0005540826 bash[199662]: cc2e7494767935faa7d23c9c04148fcd19bfe6c3ed9edb784e0d0623d2a355bd
Dec  1 05:06:16 np0005540826 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.osfnzc for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 05:06:16 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:06:16 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Dec  1 05:06:16 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:06:16 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Dec  1 05:06:16 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:06:16 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Dec  1 05:06:16 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:06:16 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Dec  1 05:06:16 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:06:16 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Dec  1 05:06:16 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:06:16 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Dec  1 05:06:16 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:06:16 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Dec  1 05:06:16 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:06:16 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  1 05:06:16 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:06:16 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:06:16 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:06:16.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:06:17 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:06:17 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:06:17 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:06:17.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:06:18 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:06:18 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:06:18 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:06:18.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:06:19 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:06:19 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:06:19 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:06:19.669 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:06:20 np0005540826 systemd-logind[787]: New session 53 of user zuul.
Dec  1 05:06:20 np0005540826 systemd[1]: Started Session 53 of User zuul.
Dec  1 05:06:20 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:06:20 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:06:20 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:06:20.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:06:21 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:06:21 np0005540826 python3.9[199901]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 05:06:21 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:06:21 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:06:21 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:06:21.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:06:22 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:06:22 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:06:22 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:06:22.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:06:22 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:06:22 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  1 05:06:22 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:06:22 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 05:06:22 np0005540826 python3.9[200056]: ansible-ansible.builtin.service_facts Invoked
Dec  1 05:06:22 np0005540826 network[200073]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec  1 05:06:22 np0005540826 network[200074]: 'network-scripts' will be removed from distribution in near future.
Dec  1 05:06:22 np0005540826 network[200075]: It is advised to switch to 'NetworkManager' instead for network management.
Dec  1 05:06:23 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:06:23 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:06:23 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:06:23.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:06:24 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:06:24 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:06:24 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:06:24.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:06:25 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:06:25 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:06:25 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:06:25.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:06:26 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:06:26 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:06:26 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:06:26 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:06:26.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:06:27 np0005540826 podman[200349]: 2025-12-01 10:06:27.551885875 +0000 UTC m=+0.130040413 container health_status 6b06bbb7dde72e1297fe084ecc5611549b12415c8a2a7d771b9728c6924dbf55 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible)
Dec  1 05:06:27 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:06:27 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:06:27 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:06:27.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:06:27 np0005540826 python3.9[200350]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  1 05:06:28 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:06:28 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:06:28 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:06:28.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:06:28 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:06:28 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  1 05:06:28 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:06:28 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Dec  1 05:06:28 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:06:28 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Dec  1 05:06:28 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:06:28 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Dec  1 05:06:28 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:06:28 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Dec  1 05:06:28 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:06:28 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Dec  1 05:06:28 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:06:28 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Dec  1 05:06:28 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:06:28 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:06:28 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:06:28 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:06:28 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:06:28 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:06:28 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:06:28 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Dec  1 05:06:28 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:06:28 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:06:28 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:06:28 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Dec  1 05:06:28 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:06:28 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Dec  1 05:06:28 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:06:28 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Dec  1 05:06:28 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:06:28 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Dec  1 05:06:28 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:06:28 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Dec  1 05:06:28 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:06:28 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Dec  1 05:06:28 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:06:28 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Dec  1 05:06:28 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:06:28 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Dec  1 05:06:28 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:06:28 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Dec  1 05:06:28 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:06:28 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Dec  1 05:06:28 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:06:28 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Dec  1 05:06:28 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:06:28 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Dec  1 05:06:28 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:06:28 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  1 05:06:28 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:06:28 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Dec  1 05:06:28 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:06:28 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  1 05:06:28 np0005540826 python3.9[200462]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  1 05:06:29 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:06:29 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa004000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:29 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:06:29 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9fec0016e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:29 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:06:29 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:06:29 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:06:29.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:06:30 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:06:30 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.002000051s ======
Dec  1 05:06:30 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:06:30.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000051s
Dec  1 05:06:30 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:06:30 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9fd8000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:31 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:06:31 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:06:31 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9fd4000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:31 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis[85207]: [WARNING] 334/100631 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  1 05:06:31 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:06:31 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9fe0000fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:31 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:06:31 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:06:31 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:06:31.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:06:32 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:06:32 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:06:32 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:06:32.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:06:32 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:06:32 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9fe0000fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:33 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:06:33 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9fec002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:33 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:06:33 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9fd80016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:33 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:06:33 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:06:33 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:06:33.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:06:34 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:06:34 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:06:34 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:06:34.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:06:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:06:34 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9fd40016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:35 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:06:35 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9fe0001f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:35 np0005540826 python3.9[200633]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 05:06:35 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:06:35 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9fec002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:35 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:06:35 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:06:35 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:06:35.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:06:36 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:06:36 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:06:36 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:06:36 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:06:36.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:06:36 np0005540826 python3.9[200786]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 05:06:36 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:06:36 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9fd80016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:37 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:06:37 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9fd4001fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:37 np0005540826 python3.9[200939]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 05:06:37 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:06:37 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9fe0001f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:37 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:06:37 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:06:37 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:06:37.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:06:37 np0005540826 podman[201063]: 2025-12-01 10:06:37.981947849 +0000 UTC m=+0.061503062 container health_status 2f6a34a2eb1139886d5df897b4264e2aa17883bd121a8757ac91426f6d44356f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec  1 05:06:38 np0005540826 python3.9[201112]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/iscsi-iname _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 05:06:38 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:06:38 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:06:38 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:06:38.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:06:38 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:06:38 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9fec002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:39 np0005540826 python3.9[201267]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:06:39 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:06:39 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9fec002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:39 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:06:39 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9fd4001fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:39 np0005540826 python3.9[201390]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764583598.4820485-246-4141105255443/.source.iscsi _original_basename=.g30ybbc4 follow=False checksum=af36d76b2f50ae0efe3fb0b12adf179eaae45a23 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:06:39 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:06:39 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.002000051s ======
Dec  1 05:06:39 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:06:39.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000051s
Dec  1 05:06:40 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:06:40 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:06:40 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:06:40.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:06:40 np0005540826 python3.9[201568]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:06:40 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:06:40 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9fe0001f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:41 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:06:41 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:06:41 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9fe0001f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:41 np0005540826 python3.9[201720]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:06:41 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:06:41 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9fec002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:41 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:06:41 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:06:41 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:06:41.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:06:42 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:06:42 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:06:42 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:06:42.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:06:42 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:06:42 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9fd4001fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:42 np0005540826 python3.9[201873]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 05:06:42 np0005540826 systemd[1]: Listening on Open-iSCSI iscsid Socket.
Dec  1 05:06:43 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:06:43 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9fd8002720 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:43 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:06:43 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9fe0003730 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:43 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:06:43 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:06:43 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:06:43.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:06:43 np0005540826 python3.9[202029]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 05:06:43 np0005540826 systemd[1]: Reloading.
Dec  1 05:06:43 np0005540826 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 05:06:43 np0005540826 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 05:06:44 np0005540826 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Dec  1 05:06:44 np0005540826 systemd[1]: Starting Open-iSCSI...
Dec  1 05:06:44 np0005540826 kernel: Loading iSCSI transport class v2.0-870.
Dec  1 05:06:44 np0005540826 systemd[1]: Started Open-iSCSI.
Dec  1 05:06:44 np0005540826 systemd[1]: Starting Logout off all iSCSI sessions on shutdown...
Dec  1 05:06:44 np0005540826 systemd[1]: Finished Logout off all iSCSI sessions on shutdown.
Dec  1 05:06:44 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:06:44 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:06:44 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:06:44.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:06:44 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:06:44 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9fec002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:45 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:06:45 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9fd40032f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:45 np0005540826 python3.9[202230]: ansible-ansible.builtin.service_facts Invoked
Dec  1 05:06:45 np0005540826 network[202247]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec  1 05:06:45 np0005540826 network[202248]: 'network-scripts' will be removed from distribution in near future.
Dec  1 05:06:45 np0005540826 network[202249]: It is advised to switch to 'NetworkManager' instead for network management.
Dec  1 05:06:45 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:06:45 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9fd8002720 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:45 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:06:45 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:06:45 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:06:45.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:06:46 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:06:46 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:06:46 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:06:46 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:06:46.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:06:46 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:06:46 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9fe0003730 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:06:47 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9fec002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:06:47 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9fec002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:47 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:06:47 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:06:47 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:06:47.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:06:48 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:06:48 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:06:48 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:06:48.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:06:48 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:06:48 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9fd8003430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:49 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:06:49 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9fe0004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:49 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:06:49 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9fec002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:49 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:06:49 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:06:49 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:06:49.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:06:50 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:06:50 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:06:50 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:06:50.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:06:50 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:06:50 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9fd40032f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:50 np0005540826 python3.9[202524]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec  1 05:06:51 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:06:51 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:06:51 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9fd8003430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:51 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:06:51 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9fe0004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:51 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:06:51 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:06:51 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:06:51.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:06:51 np0005540826 python3.9[202676]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Dec  1 05:06:52 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:06:52 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:06:52 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:06:52.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:06:52 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:06:52 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9fec002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:52 np0005540826 python3.9[202833]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:06:53 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:06:53 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9fec002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:53 np0005540826 python3.9[202956]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764583612.1249354-477-133914752227664/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:06:53 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:06:53 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9fd8003430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:53 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:06:53 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:06:53 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:06:53.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:06:54 np0005540826 python3.9[203108]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:06:54 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:06:54 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:06:54 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:06:54.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:06:54 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:06:54 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9fe0004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:55 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:06:55 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9fd4004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:55 np0005540826 python3.9[203341]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  1 05:06:55 np0005540826 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Dec  1 05:06:55 np0005540826 systemd[1]: Stopped Load Kernel Modules.
Dec  1 05:06:55 np0005540826 systemd[1]: Stopping Load Kernel Modules...
Dec  1 05:06:55 np0005540826 systemd[1]: Starting Load Kernel Modules...
Dec  1 05:06:55 np0005540826 systemd[1]: Finished Load Kernel Modules.
Dec  1 05:06:55 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:06:55 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9fec002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:55 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 05:06:55 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:06:55 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:06:55 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 05:06:55 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:06:55 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:06:55 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:06:55.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:06:56 np0005540826 python3.9[203497]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  1 05:06:56 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:06:56 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:06:56 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:06:56 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:06:56.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:06:56 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:06:56 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9fd8003430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:56 np0005540826 python3.9[203650]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 05:06:57 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:06:57 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9fe0004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:57 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:06:57 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9fd4004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:57 np0005540826 python3.9[203802]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 05:06:57 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:06:57 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:06:57 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:06:57.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:06:58 np0005540826 podman[203829]: 2025-12-01 10:06:58.017346861 +0000 UTC m=+0.097847396 container health_status 6b06bbb7dde72e1297fe084ecc5611549b12415c8a2a7d771b9728c6924dbf55 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125)
Dec  1 05:06:58 np0005540826 python3.9[203979]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:06:58 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:06:58 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:06:58 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:06:58.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:06:58 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:06:58 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9fec003cd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:58 np0005540826 python3.9[204102]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764583617.932248-651-186317616920989/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:06:59 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:06:59 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9fd8003430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:59 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:06:59 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9fd8003430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:59 np0005540826 python3.9[204254]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 05:06:59 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:06:59 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:06:59 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:06:59.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:07:00 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:07:00 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:07:00 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:07:00.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:07:00 np0005540826 python3.9[204408]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:07:00 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:07:00 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9fd4004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:01 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:07:01 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:07:01 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:07:01 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:07:01 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9fd4004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:01 np0005540826 python3.9[204611]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:07:01 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:07:01 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9fcc000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:01 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:07:01 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:07:01 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:07:01.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:07:01 np0005540826 python3.9[204763]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:07:02 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:07:02 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:07:02 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:07:02.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:07:02 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:07:02 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9fd8003430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:02 np0005540826 python3.9[204916]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:07:03 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:07:03 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9fd4004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:03 np0005540826 python3.9[205069]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:07:03 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:07:03 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa004001d70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:03 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:07:03 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:07:03 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:07:03.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:07:04 np0005540826 python3.9[205221]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:07:04 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:07:04 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:07:04 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:07:04.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:07:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:07:04.537 141685 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:07:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:07:04.539 141685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:07:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:07:04.539 141685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:07:04 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:07:04 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9fec003cd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:04 np0005540826 python3.9[205374]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:07:05 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:07:05 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9fd8003430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:05 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:07:05 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9fd4004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:05 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:07:05 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:07:05 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:07:05.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:07:05 np0005540826 python3.9[205526]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 05:07:06 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:07:06 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:07:06 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:07:06 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:07:06.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:07:06 np0005540826 python3.9[205681]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/multipath/.multipath_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:07:06 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:07:06 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa004000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:07 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:07:07 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9fec003cd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:07 np0005540826 python3.9[205833]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  1 05:07:07 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:07:07 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9fd8003430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:07 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:07:07 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:07:07 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:07:07.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:07:08 np0005540826 podman[205957]: 2025-12-01 10:07:08.140161315 +0000 UTC m=+0.080798618 container health_status 2f6a34a2eb1139886d5df897b4264e2aa17883bd121a8757ac91426f6d44356f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent)
Dec  1 05:07:08 np0005540826 python3.9[206005]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:07:08 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:07:08 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:07:08 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:07:08.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:07:08 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:07:08 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9fd4004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:08 np0005540826 python3.9[206084]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 05:07:09 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:07:09 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa004000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:09 np0005540826 python3.9[206236]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:07:09 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:07:09 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9fec003cd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:09 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:07:09 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:07:09 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:07:09.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:07:09 np0005540826 python3.9[206314]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 05:07:10 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:07:10 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:07:10 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:07:10.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:07:10 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:07:10 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9fd8003430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:10 np0005540826 python3.9[206467]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:07:11 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:07:11 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:07:11 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9fd4004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:11 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:07:11 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa004000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:11 np0005540826 python3.9[206619]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:07:11 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:07:11 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:07:11 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:07:11.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:07:11 np0005540826 python3.9[206697]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:07:12 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:07:12 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:07:12 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:07:12.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:07:12 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:07:12 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9fec003cd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:12 np0005540826 python3.9[206850]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:07:13 np0005540826 python3.9[206928]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:07:13 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:07:13 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9fec003cd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:13 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:07:13 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9fd4004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:13 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:07:13 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:07:13 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:07:13.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:07:14 np0005540826 python3.9[207080]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 05:07:14 np0005540826 systemd[1]: Reloading.
Dec  1 05:07:14 np0005540826 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 05:07:14 np0005540826 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 05:07:14 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:07:14 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:07:14 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:07:14.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:07:14 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:07:14 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa004000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:14 np0005540826 ceph-mon[80026]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #34. Immutable memtables: 0.
Dec  1 05:07:14 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:07:14.884499) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  1 05:07:14 np0005540826 ceph-mon[80026]: rocksdb: [db/flush_job.cc:856] [default] [JOB 17] Flushing memtable with next log file: 34
Dec  1 05:07:14 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583634884556, "job": 17, "event": "flush_started", "num_memtables": 1, "num_entries": 1152, "num_deletes": 256, "total_data_size": 2779650, "memory_usage": 2823856, "flush_reason": "Manual Compaction"}
Dec  1 05:07:14 np0005540826 ceph-mon[80026]: rocksdb: [db/flush_job.cc:885] [default] [JOB 17] Level-0 flush table #35: started
Dec  1 05:07:14 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583634896303, "cf_name": "default", "job": 17, "event": "table_file_creation", "file_number": 35, "file_size": 1786618, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 18133, "largest_seqno": 19280, "table_properties": {"data_size": 1781656, "index_size": 2486, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1413, "raw_key_size": 10397, "raw_average_key_size": 18, "raw_value_size": 1771576, "raw_average_value_size": 3157, "num_data_blocks": 111, "num_entries": 561, "num_filter_entries": 561, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764583543, "oldest_key_time": 1764583543, "file_creation_time": 1764583634, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d6e1f97e-eb58-41c1-b758-cb672eabd75e", "db_session_id": "KSHNHU57VR1LHZ6EZUM4", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Dec  1 05:07:14 np0005540826 ceph-mon[80026]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 17] Flush lasted 11879 microseconds, and 5823 cpu microseconds.
Dec  1 05:07:14 np0005540826 ceph-mon[80026]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 05:07:14 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:07:14.896375) [db/flush_job.cc:967] [default] [JOB 17] Level-0 flush table #35: 1786618 bytes OK
Dec  1 05:07:14 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:07:14.896402) [db/memtable_list.cc:519] [default] Level-0 commit table #35 started
Dec  1 05:07:14 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:07:14.898122) [db/memtable_list.cc:722] [default] Level-0 commit table #35: memtable #1 done
Dec  1 05:07:14 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:07:14.898143) EVENT_LOG_v1 {"time_micros": 1764583634898137, "job": 17, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  1 05:07:14 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:07:14.898166) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  1 05:07:14 np0005540826 ceph-mon[80026]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 17] Try to delete WAL files size 2774030, prev total WAL file size 2774030, number of live WAL files 2.
Dec  1 05:07:14 np0005540826 ceph-mon[80026]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000031.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:07:14 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:07:14.899314) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0030' seq:72057594037927935, type:22 .. '6C6F676D00323533' seq:0, type:0; will stop at (end)
Dec  1 05:07:14 np0005540826 ceph-mon[80026]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 18] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  1 05:07:14 np0005540826 ceph-mon[80026]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 17 Base level 0, inputs: [35(1744KB)], [33(11MB)]
Dec  1 05:07:14 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583634899352, "job": 18, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [35], "files_L6": [33], "score": -1, "input_data_size": 13878355, "oldest_snapshot_seqno": -1}
Dec  1 05:07:14 np0005540826 ceph-mon[80026]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 18] Generated table #36: 4907 keys, 13411381 bytes, temperature: kUnknown
Dec  1 05:07:14 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583634960958, "cf_name": "default", "job": 18, "event": "table_file_creation", "file_number": 36, "file_size": 13411381, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13377401, "index_size": 20622, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12293, "raw_key_size": 125126, "raw_average_key_size": 25, "raw_value_size": 13286937, "raw_average_value_size": 2707, "num_data_blocks": 846, "num_entries": 4907, "num_filter_entries": 4907, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582559, "oldest_key_time": 0, "file_creation_time": 1764583634, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d6e1f97e-eb58-41c1-b758-cb672eabd75e", "db_session_id": "KSHNHU57VR1LHZ6EZUM4", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Dec  1 05:07:14 np0005540826 ceph-mon[80026]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 05:07:14 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:07:14.961424) [db/compaction/compaction_job.cc:1663] [default] [JOB 18] Compacted 1@0 + 1@6 files to L6 => 13411381 bytes
Dec  1 05:07:14 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:07:14.962927) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 224.3 rd, 216.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.7, 11.5 +0.0 blob) out(12.8 +0.0 blob), read-write-amplify(15.3) write-amplify(7.5) OK, records in: 5433, records dropped: 526 output_compression: NoCompression
Dec  1 05:07:14 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:07:14.962948) EVENT_LOG_v1 {"time_micros": 1764583634962939, "job": 18, "event": "compaction_finished", "compaction_time_micros": 61877, "compaction_time_cpu_micros": 28178, "output_level": 6, "num_output_files": 1, "total_output_size": 13411381, "num_input_records": 5433, "num_output_records": 4907, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  1 05:07:14 np0005540826 ceph-mon[80026]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000035.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:07:14 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583634963824, "job": 18, "event": "table_file_deletion", "file_number": 35}
Dec  1 05:07:14 np0005540826 ceph-mon[80026]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000033.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:07:14 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583634966280, "job": 18, "event": "table_file_deletion", "file_number": 33}
Dec  1 05:07:14 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:07:14.899256) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:07:14 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:07:14.966517) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:07:14 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:07:14.966524) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:07:14 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:07:14.966526) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:07:14 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:07:14.966528) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:07:14 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:07:14.966530) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:07:15 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:07:15 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9fec003cd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:15 np0005540826 python3.9[207270]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:07:15 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:07:15 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9fec003cd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:15 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:07:15 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:07:15 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:07:15.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:07:15 np0005540826 python3.9[207348]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:07:16 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:07:16 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:07:16 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:07:16 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:07:16.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:07:16 np0005540826 python3.9[207501]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:07:16 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:07:16 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9fd4004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:17 np0005540826 python3.9[207579]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:07:17 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:07:17 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa0040095a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:17 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:07:17 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9fec003cd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:17 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:07:17 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:07:17 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:07:17.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:07:18 np0005540826 python3.9[207731]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 05:07:18 np0005540826 systemd[1]: Reloading.
Dec  1 05:07:18 np0005540826 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 05:07:18 np0005540826 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 05:07:18 np0005540826 systemd[1]: Starting Create netns directory...
Dec  1 05:07:18 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:07:18 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:07:18 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:07:18.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:07:18 np0005540826 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec  1 05:07:18 np0005540826 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec  1 05:07:18 np0005540826 systemd[1]: Finished Create netns directory.
Dec  1 05:07:18 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:07:18 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9fd8003430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:19 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:07:19 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9fd4004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:19 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:07:19 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa0040095a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:19 np0005540826 python3.9[207926]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 05:07:19 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:07:19 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.004000103s ======
Dec  1 05:07:19 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:07:19.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.004000103s
Dec  1 05:07:20 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:07:20 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:07:20 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:07:20.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:07:20 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:07:20 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9fec003cd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:20 np0005540826 python3.9[208102]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/multipathd/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:07:21 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:07:21 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:07:21 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9fd8004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:21 np0005540826 python3.9[208227]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/multipathd/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764583640.165885-1272-235945907836332/.source _original_basename=healthcheck follow=False checksum=af9d0c1c8f3cb0e30ce9609be9d5b01924d0d23f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec  1 05:07:21 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:07:21 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9fd4004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:21 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:07:21 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:07:21 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:07:21.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:07:22 np0005540826 python3.9[208380]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  1 05:07:22 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:07:22 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:07:22 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:07:22.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:07:22 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:07:22 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa0040095a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:23 np0005540826 python3.9[208532]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/multipathd.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:07:23 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:07:23 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9fec003cd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:23 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:07:23 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa0040095a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:23 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:07:23 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:07:23 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:07:23.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:07:23 np0005540826 python3.9[208655]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/multipathd.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764583642.72234-1347-247797055445060/.source.json _original_basename=.9yyxr71g follow=False checksum=3f7959ee8ac9757398adcc451c3b416c957d7c14 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:07:24 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:07:24 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:07:24 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:07:24.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:07:24 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:07:24 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9fd4004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:25 np0005540826 python3.9[208808]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/multipathd state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:07:25 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis[85207]: [WARNING] 334/100725 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  1 05:07:25 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:07:25 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9fd8004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:25 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:07:25 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9fec003cd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:25 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:07:25 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:07:25 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:07:25.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:07:26 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:07:26 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:07:26 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:07:26 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:07:26.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:07:26 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:07:26 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa0040095a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:27 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:07:27 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9fd4004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:27 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:07:27 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9fd8004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:27 np0005540826 python3.9[209236]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/multipathd config_pattern=*.json debug=False
Dec  1 05:07:27 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:07:27 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:07:27 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:07:27.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:07:28 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:07:28 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:07:28 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:07:28.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:07:28 np0005540826 podman[209361]: 2025-12-01 10:07:28.603706064 +0000 UTC m=+0.098660338 container health_status 6b06bbb7dde72e1297fe084ecc5611549b12415c8a2a7d771b9728c6924dbf55 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec  1 05:07:28 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:07:28 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9fec003cd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:28 np0005540826 python3.9[209406]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec  1 05:07:29 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:07:29 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa00400a640 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:29 np0005540826 kernel: ganesha.nfsd[204917]: segfault at 50 ip 00007fa0b095632e sp 00007fa07fffe210 error 4 in libntirpc.so.5.8[7fa0b093b000+2c000] likely on CPU 5 (core 0, socket 5)
Dec  1 05:07:29 np0005540826 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Dec  1 05:07:29 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[199679]: 01/12/2025 10:07:29 : epoch 692d6898 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa00400a640 fd 38 proxy ignored for local
Dec  1 05:07:29 np0005540826 systemd[1]: Started Process Core Dump (PID 209540/UID 0).
Dec  1 05:07:29 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:07:29 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:07:29 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:07:29.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:07:29 np0005540826 python3.9[209570]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Dec  1 05:07:30 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:07:30 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:07:30 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:07:30.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:07:30 np0005540826 systemd-coredump[209550]: Process 199683 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 56:#012#0  0x00007fa0b095632e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Dec  1 05:07:30 np0005540826 systemd[1]: systemd-coredump@7-209540-0.service: Deactivated successfully.
Dec  1 05:07:30 np0005540826 systemd[1]: systemd-coredump@7-209540-0.service: Consumed 1.225s CPU time.
Dec  1 05:07:30 np0005540826 podman[209626]: 2025-12-01 10:07:30.938383436 +0000 UTC m=+0.030125016 container died cc2e7494767935faa7d23c9c04148fcd19bfe6c3ed9edb784e0d0623d2a355bd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 05:07:30 np0005540826 systemd[1]: var-lib-containers-storage-overlay-d96b48475c5fcde35c29db491a525c229588f1523d0b50c13a3f6bbefc81b98d-merged.mount: Deactivated successfully.
Dec  1 05:07:30 np0005540826 podman[209626]: 2025-12-01 10:07:30.999503997 +0000 UTC m=+0.091245567 container remove cc2e7494767935faa7d23c9c04148fcd19bfe6c3ed9edb784e0d0623d2a355bd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS)
Dec  1 05:07:31 np0005540826 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.0.0.compute-1.osfnzc.service: Main process exited, code=exited, status=139/n/a
Dec  1 05:07:31 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:07:31 np0005540826 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.0.0.compute-1.osfnzc.service: Failed with result 'exit-code'.
Dec  1 05:07:31 np0005540826 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.0.0.compute-1.osfnzc.service: Consumed 1.657s CPU time.
Dec  1 05:07:31 np0005540826 systemd[1]: virtnodedevd.service: Deactivated successfully.
Dec  1 05:07:31 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:07:31 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:07:31 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:07:31.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:07:32 np0005540826 python3[209797]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/multipathd config_id=multipathd config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Dec  1 05:07:32 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:07:32 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:07:32 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:07:32.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:07:32 np0005540826 systemd[1]: virtproxyd.service: Deactivated successfully.
Dec  1 05:07:33 np0005540826 podman[209813]: 2025-12-01 10:07:33.484742459 +0000 UTC m=+1.263199117 image pull 9af6aa52ee187025bc25565b66d3eefb486acac26f9281e33f4cce76a40d21f7 quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Dec  1 05:07:33 np0005540826 podman[209871]: 2025-12-01 10:07:33.673878831 +0000 UTC m=+0.085512069 container create 9ce59c2758d021c154e97f8a3178b73d8f43045c59b9ba6fd93369a760a49136 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd)
Dec  1 05:07:33 np0005540826 podman[209871]: 2025-12-01 10:07:33.612050962 +0000 UTC m=+0.023684220 image pull 9af6aa52ee187025bc25565b66d3eefb486acac26f9281e33f4cce76a40d21f7 quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Dec  1 05:07:33 np0005540826 python3[209797]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name multipathd --conmon-pidfile /run/multipathd.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=multipathd --label container_name=multipathd --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run/udev:/run/udev --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /var/lib/openstack/healthchecks/multipathd:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Dec  1 05:07:33 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:07:33 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:07:33 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:07:33.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:07:34 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:07:34 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:07:34 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:07:34.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:07:35 np0005540826 python3.9[210063]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 05:07:35 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis[85207]: [WARNING] 334/100735 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  1 05:07:35 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:07:35 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:07:35 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:07:35.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:07:36 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:07:36 np0005540826 python3.9[210217]: ansible-file Invoked with path=/etc/systemd/system/edpm_multipathd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:07:36 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:07:36 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:07:36 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:07:36.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:07:36 np0005540826 python3.9[210294]: ansible-stat Invoked with path=/etc/systemd/system/edpm_multipathd_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 05:07:37 np0005540826 python3.9[210445]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764583656.665588-1611-130073396747571/source dest=/etc/systemd/system/edpm_multipathd.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:07:37 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:07:37 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:07:37 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:07:37.793 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:07:37 np0005540826 python3.9[210521]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec  1 05:07:37 np0005540826 systemd[1]: Reloading.
Dec  1 05:07:38 np0005540826 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 05:07:38 np0005540826 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 05:07:38 np0005540826 podman[210558]: 2025-12-01 10:07:38.381015504 +0000 UTC m=+0.066975741 container health_status 2f6a34a2eb1139886d5df897b4264e2aa17883bd121a8757ac91426f6d44356f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true)
Dec  1 05:07:38 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:07:38 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:07:38 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:07:38.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:07:38 np0005540826 python3.9[210653]: ansible-systemd Invoked with state=restarted name=edpm_multipathd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 05:07:39 np0005540826 systemd[1]: Reloading.
Dec  1 05:07:39 np0005540826 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 05:07:39 np0005540826 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 05:07:39 np0005540826 systemd[1]: Starting multipathd container...
Dec  1 05:07:39 np0005540826 systemd[1]: Started libcrun container.
Dec  1 05:07:39 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7c1f5262e41eec3c341a513a837430cda7696397b702a9178172238b28ee5890/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec  1 05:07:39 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7c1f5262e41eec3c341a513a837430cda7696397b702a9178172238b28ee5890/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec  1 05:07:39 np0005540826 systemd[1]: Started /usr/bin/podman healthcheck run 9ce59c2758d021c154e97f8a3178b73d8f43045c59b9ba6fd93369a760a49136.
Dec  1 05:07:39 np0005540826 podman[210693]: 2025-12-01 10:07:39.516164402 +0000 UTC m=+0.134186447 container init 9ce59c2758d021c154e97f8a3178b73d8f43045c59b9ba6fd93369a760a49136 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3)
Dec  1 05:07:39 np0005540826 multipathd[210708]: + sudo -E kolla_set_configs
Dec  1 05:07:39 np0005540826 podman[210693]: 2025-12-01 10:07:39.543527661 +0000 UTC m=+0.161549695 container start 9ce59c2758d021c154e97f8a3178b73d8f43045c59b9ba6fd93369a760a49136 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Dec  1 05:07:39 np0005540826 podman[210693]: multipathd
Dec  1 05:07:39 np0005540826 systemd[1]: Started multipathd container.
Dec  1 05:07:39 np0005540826 multipathd[210708]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec  1 05:07:39 np0005540826 multipathd[210708]: INFO:__main__:Validating config file
Dec  1 05:07:39 np0005540826 multipathd[210708]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec  1 05:07:39 np0005540826 multipathd[210708]: INFO:__main__:Writing out command to execute
Dec  1 05:07:39 np0005540826 multipathd[210708]: ++ cat /run_command
Dec  1 05:07:39 np0005540826 multipathd[210708]: + CMD='/usr/sbin/multipathd -d'
Dec  1 05:07:39 np0005540826 multipathd[210708]: + ARGS=
Dec  1 05:07:39 np0005540826 multipathd[210708]: + sudo kolla_copy_cacerts
Dec  1 05:07:39 np0005540826 podman[210714]: 2025-12-01 10:07:39.621520292 +0000 UTC m=+0.063085811 container health_status 9ce59c2758d021c154e97f8a3178b73d8f43045c59b9ba6fd93369a760a49136 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible)
Dec  1 05:07:39 np0005540826 systemd[1]: 9ce59c2758d021c154e97f8a3178b73d8f43045c59b9ba6fd93369a760a49136-47af635b6ffc8092.service: Main process exited, code=exited, status=1/FAILURE
Dec  1 05:07:39 np0005540826 systemd[1]: 9ce59c2758d021c154e97f8a3178b73d8f43045c59b9ba6fd93369a760a49136-47af635b6ffc8092.service: Failed with result 'exit-code'.
Dec  1 05:07:39 np0005540826 multipathd[210708]: + [[ ! -n '' ]]
Dec  1 05:07:39 np0005540826 multipathd[210708]: + . kolla_extend_start
Dec  1 05:07:39 np0005540826 multipathd[210708]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Dec  1 05:07:39 np0005540826 multipathd[210708]: Running command: '/usr/sbin/multipathd -d'
Dec  1 05:07:39 np0005540826 multipathd[210708]: + umask 0022
Dec  1 05:07:39 np0005540826 multipathd[210708]: + exec /usr/sbin/multipathd -d
Dec  1 05:07:39 np0005540826 multipathd[210708]: 3522.624938 | --------start up--------
Dec  1 05:07:39 np0005540826 multipathd[210708]: 3522.624956 | read /etc/multipath.conf
Dec  1 05:07:39 np0005540826 multipathd[210708]: 3522.631739 | path checkers start up
Dec  1 05:07:39 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:07:39 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:07:39 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:07:39.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:07:40 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:07:40 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:07:40 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:07:40.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:07:40 np0005540826 python3.9[210899]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath/.multipath_restart_required follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 05:07:41 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:07:41 np0005540826 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.0.0.compute-1.osfnzc.service: Scheduled restart job, restart counter is at 8.
Dec  1 05:07:41 np0005540826 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.osfnzc for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 05:07:41 np0005540826 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.0.0.compute-1.osfnzc.service: Consumed 1.657s CPU time.
Dec  1 05:07:41 np0005540826 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.osfnzc for 365f19c2-81e5-5edd-b6b4-280555214d3a...
Dec  1 05:07:41 np0005540826 podman[211121]: 2025-12-01 10:07:41.452998308 +0000 UTC m=+0.042514986 container create f0f91343ccffed1c19232c65117ccfcfea511a270d1f429d5ffaff9654169fad (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 05:07:41 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dab35661d023c3ff53e8f7f11947cf9bddc11449e3b5caf4dab75ef1e99ae8ce/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Dec  1 05:07:41 np0005540826 python3.9[211090]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps --filter volume=/etc/multipath.conf --format {{.Names}} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 05:07:41 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dab35661d023c3ff53e8f7f11947cf9bddc11449e3b5caf4dab75ef1e99ae8ce/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 05:07:41 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dab35661d023c3ff53e8f7f11947cf9bddc11449e3b5caf4dab75ef1e99ae8ce/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 05:07:41 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dab35661d023c3ff53e8f7f11947cf9bddc11449e3b5caf4dab75ef1e99ae8ce/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.osfnzc-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 05:07:41 np0005540826 podman[211121]: 2025-12-01 10:07:41.518697706 +0000 UTC m=+0.108214404 container init f0f91343ccffed1c19232c65117ccfcfea511a270d1f429d5ffaff9654169fad (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.build-date=20250325, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec  1 05:07:41 np0005540826 podman[211121]: 2025-12-01 10:07:41.524202266 +0000 UTC m=+0.113718944 container start f0f91343ccffed1c19232c65117ccfcfea511a270d1f429d5ffaff9654169fad (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 05:07:41 np0005540826 bash[211121]: f0f91343ccffed1c19232c65117ccfcfea511a270d1f429d5ffaff9654169fad
Dec  1 05:07:41 np0005540826 podman[211121]: 2025-12-01 10:07:41.434110446 +0000 UTC m=+0.023627144 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 05:07:41 np0005540826 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.osfnzc for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 05:07:41 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[211137]: 01/12/2025 10:07:41 : epoch 692d68ed : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Dec  1 05:07:41 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[211137]: 01/12/2025 10:07:41 : epoch 692d68ed : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Dec  1 05:07:41 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[211137]: 01/12/2025 10:07:41 : epoch 692d68ed : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Dec  1 05:07:41 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[211137]: 01/12/2025 10:07:41 : epoch 692d68ed : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Dec  1 05:07:41 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[211137]: 01/12/2025 10:07:41 : epoch 692d68ed : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Dec  1 05:07:41 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[211137]: 01/12/2025 10:07:41 : epoch 692d68ed : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Dec  1 05:07:41 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[211137]: 01/12/2025 10:07:41 : epoch 692d68ed : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Dec  1 05:07:41 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[211137]: 01/12/2025 10:07:41 : epoch 692d68ed : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  1 05:07:41 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:07:41 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:07:41 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:07:41.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:07:42 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:07:42 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:07:42 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:07:42.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:07:42 np0005540826 python3.9[211344]: ansible-ansible.builtin.systemd Invoked with name=edpm_multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  1 05:07:42 np0005540826 systemd[1]: Stopping multipathd container...
Dec  1 05:07:42 np0005540826 multipathd[210708]: 3525.598649 | exit (signal)
Dec  1 05:07:42 np0005540826 multipathd[210708]: 3525.598742 | --------shut down-------
Dec  1 05:07:42 np0005540826 systemd[1]: libpod-9ce59c2758d021c154e97f8a3178b73d8f43045c59b9ba6fd93369a760a49136.scope: Deactivated successfully.
Dec  1 05:07:42 np0005540826 podman[211348]: 2025-12-01 10:07:42.656968984 +0000 UTC m=+0.069695889 container died 9ce59c2758d021c154e97f8a3178b73d8f43045c59b9ba6fd93369a760a49136 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Dec  1 05:07:42 np0005540826 systemd[1]: 9ce59c2758d021c154e97f8a3178b73d8f43045c59b9ba6fd93369a760a49136-47af635b6ffc8092.timer: Deactivated successfully.
Dec  1 05:07:42 np0005540826 systemd[1]: Stopped /usr/bin/podman healthcheck run 9ce59c2758d021c154e97f8a3178b73d8f43045c59b9ba6fd93369a760a49136.
Dec  1 05:07:42 np0005540826 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9ce59c2758d021c154e97f8a3178b73d8f43045c59b9ba6fd93369a760a49136-userdata-shm.mount: Deactivated successfully.
Dec  1 05:07:42 np0005540826 systemd[1]: var-lib-containers-storage-overlay-7c1f5262e41eec3c341a513a837430cda7696397b702a9178172238b28ee5890-merged.mount: Deactivated successfully.
Dec  1 05:07:42 np0005540826 podman[211348]: 2025-12-01 10:07:42.84443781 +0000 UTC m=+0.257164715 container cleanup 9ce59c2758d021c154e97f8a3178b73d8f43045c59b9ba6fd93369a760a49136 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Dec  1 05:07:42 np0005540826 podman[211348]: multipathd
Dec  1 05:07:42 np0005540826 podman[211375]: multipathd
Dec  1 05:07:42 np0005540826 systemd[1]: edpm_multipathd.service: Deactivated successfully.
Dec  1 05:07:42 np0005540826 systemd[1]: Stopped multipathd container.
Dec  1 05:07:42 np0005540826 systemd[1]: Starting multipathd container...
Dec  1 05:07:43 np0005540826 systemd[1]: Started libcrun container.
Dec  1 05:07:43 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7c1f5262e41eec3c341a513a837430cda7696397b702a9178172238b28ee5890/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec  1 05:07:43 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7c1f5262e41eec3c341a513a837430cda7696397b702a9178172238b28ee5890/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec  1 05:07:43 np0005540826 systemd[1]: Started /usr/bin/podman healthcheck run 9ce59c2758d021c154e97f8a3178b73d8f43045c59b9ba6fd93369a760a49136.
Dec  1 05:07:43 np0005540826 podman[211388]: 2025-12-01 10:07:43.071440305 +0000 UTC m=+0.128457260 container init 9ce59c2758d021c154e97f8a3178b73d8f43045c59b9ba6fd93369a760a49136 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Dec  1 05:07:43 np0005540826 multipathd[211403]: + sudo -E kolla_set_configs
Dec  1 05:07:43 np0005540826 podman[211388]: 2025-12-01 10:07:43.096289189 +0000 UTC m=+0.153306114 container start 9ce59c2758d021c154e97f8a3178b73d8f43045c59b9ba6fd93369a760a49136 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec  1 05:07:43 np0005540826 podman[211388]: multipathd
Dec  1 05:07:43 np0005540826 systemd[1]: Started multipathd container.
Dec  1 05:07:43 np0005540826 multipathd[211403]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec  1 05:07:43 np0005540826 multipathd[211403]: INFO:__main__:Validating config file
Dec  1 05:07:43 np0005540826 multipathd[211403]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec  1 05:07:43 np0005540826 multipathd[211403]: INFO:__main__:Writing out command to execute
Dec  1 05:07:43 np0005540826 multipathd[211403]: ++ cat /run_command
Dec  1 05:07:43 np0005540826 multipathd[211403]: + CMD='/usr/sbin/multipathd -d'
Dec  1 05:07:43 np0005540826 multipathd[211403]: + ARGS=
Dec  1 05:07:43 np0005540826 multipathd[211403]: + sudo kolla_copy_cacerts
Dec  1 05:07:43 np0005540826 multipathd[211403]: + [[ ! -n '' ]]
Dec  1 05:07:43 np0005540826 multipathd[211403]: + . kolla_extend_start
Dec  1 05:07:43 np0005540826 multipathd[211403]: Running command: '/usr/sbin/multipathd -d'
Dec  1 05:07:43 np0005540826 multipathd[211403]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Dec  1 05:07:43 np0005540826 multipathd[211403]: + umask 0022
Dec  1 05:07:43 np0005540826 multipathd[211403]: + exec /usr/sbin/multipathd -d
Dec  1 05:07:43 np0005540826 podman[211411]: 2025-12-01 10:07:43.201229268 +0000 UTC m=+0.094818351 container health_status 9ce59c2758d021c154e97f8a3178b73d8f43045c59b9ba6fd93369a760a49136 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 05:07:43 np0005540826 systemd[1]: 9ce59c2758d021c154e97f8a3178b73d8f43045c59b9ba6fd93369a760a49136-5d137b323e0a60b0.service: Main process exited, code=exited, status=1/FAILURE
Dec  1 05:07:43 np0005540826 systemd[1]: 9ce59c2758d021c154e97f8a3178b73d8f43045c59b9ba6fd93369a760a49136-5d137b323e0a60b0.service: Failed with result 'exit-code'.
Dec  1 05:07:43 np0005540826 multipathd[211403]: 3526.187533 | --------start up--------
Dec  1 05:07:43 np0005540826 multipathd[211403]: 3526.187554 | read /etc/multipath.conf
Dec  1 05:07:43 np0005540826 multipathd[211403]: 3526.196572 | path checkers start up
Dec  1 05:07:43 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:07:43 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:07:43 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:07:43.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:07:43 np0005540826 python3.9[211595]: ansible-ansible.builtin.file Invoked with path=/etc/multipath/.multipath_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:07:44 np0005540826 systemd[1]: virtsecretd.service: Deactivated successfully.
Dec  1 05:07:44 np0005540826 systemd[1]: virtqemud.service: Deactivated successfully.
Dec  1 05:07:44 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:07:44 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:07:44 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:07:44.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:07:45 np0005540826 python3.9[211750]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec  1 05:07:45 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:07:45 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:07:45 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:07:45.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:07:45 np0005540826 python3.9[211902]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Dec  1 05:07:45 np0005540826 kernel: Key type psk registered
Dec  1 05:07:46 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:07:46 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:07:46 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:07:46 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:07:46.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:07:46 np0005540826 python3.9[212066]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:07:47 np0005540826 python3.9[212189]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764583666.189447-1851-97561705677104/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:07:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[211137]: 01/12/2025 10:07:47 : epoch 692d68ed : compute-1 : ganesha.nfsd-2[main] rados_kv_traverse :CLIENT ID :EVENT :Failed to lst kv ret=-2
Dec  1 05:07:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[211137]: 01/12/2025 10:07:47 : epoch 692d68ed : compute-1 : ganesha.nfsd-2[main] rados_cluster_read_clids :CLIENT ID :EVENT :Failed to traverse recovery db: -2
Dec  1 05:07:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[211137]: 01/12/2025 10:07:47 : epoch 692d68ed : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  1 05:07:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[211137]: 01/12/2025 10:07:47 : epoch 692d68ed : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 05:07:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[211137]: 01/12/2025 10:07:47 : epoch 692d68ed : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  1 05:07:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[211137]: 01/12/2025 10:07:47 : epoch 692d68ed : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  1 05:07:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[211137]: 01/12/2025 10:07:47 : epoch 692d68ed : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  1 05:07:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[211137]: 01/12/2025 10:07:47 : epoch 692d68ed : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 05:07:47 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:07:47 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:07:47 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:07:47.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:07:48 np0005540826 python3.9[212341]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:07:48 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:07:48 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:07:48 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:07:48.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:07:48 np0005540826 python3.9[212494]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  1 05:07:49 np0005540826 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Dec  1 05:07:49 np0005540826 systemd[1]: Stopped Load Kernel Modules.
Dec  1 05:07:49 np0005540826 systemd[1]: Stopping Load Kernel Modules...
Dec  1 05:07:49 np0005540826 systemd[1]: Starting Load Kernel Modules...
Dec  1 05:07:49 np0005540826 systemd[1]: Finished Load Kernel Modules.
Dec  1 05:07:49 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:07:49 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:07:49 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:07:49.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:07:50 np0005540826 python3.9[212650]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  1 05:07:50 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:07:50 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:07:50 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:07:50.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:07:51 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:07:51 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis[85207]: [WARNING] 334/100751 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 1ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  1 05:07:51 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:07:51 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:07:51 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:07:51.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:07:52 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:07:52 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:07:52 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:07:52.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:07:52 np0005540826 systemd[1]: Reloading.
Dec  1 05:07:52 np0005540826 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 05:07:52 np0005540826 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 05:07:52 np0005540826 systemd[1]: Reloading.
Dec  1 05:07:53 np0005540826 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 05:07:53 np0005540826 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 05:07:53 np0005540826 systemd-logind[787]: Watching system buttons on /dev/input/event0 (Power Button)
Dec  1 05:07:53 np0005540826 systemd-logind[787]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Dec  1 05:07:53 np0005540826 lvm[212765]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec  1 05:07:53 np0005540826 lvm[212765]: VG ceph_vg0 finished
Dec  1 05:07:53 np0005540826 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec  1 05:07:53 np0005540826 systemd[1]: Starting man-db-cache-update.service...
Dec  1 05:07:53 np0005540826 systemd[1]: Reloading.
Dec  1 05:07:53 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[211137]: 01/12/2025 10:07:53 : epoch 692d68ed : compute-1 : ganesha.nfsd-2[main] rados_cluster_end_grace :CLIENT ID :EVENT :Failed to remove rec-0000000000000019:nfs.cephfs.0: -2
Dec  1 05:07:53 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[211137]: 01/12/2025 10:07:53 : epoch 692d68ed : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  1 05:07:53 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[211137]: 01/12/2025 10:07:53 : epoch 692d68ed : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Dec  1 05:07:53 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[211137]: 01/12/2025 10:07:53 : epoch 692d68ed : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Dec  1 05:07:53 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[211137]: 01/12/2025 10:07:53 : epoch 692d68ed : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Dec  1 05:07:53 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[211137]: 01/12/2025 10:07:53 : epoch 692d68ed : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Dec  1 05:07:53 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[211137]: 01/12/2025 10:07:53 : epoch 692d68ed : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Dec  1 05:07:53 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[211137]: 01/12/2025 10:07:53 : epoch 692d68ed : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Dec  1 05:07:53 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[211137]: 01/12/2025 10:07:53 : epoch 692d68ed : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:07:53 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[211137]: 01/12/2025 10:07:53 : epoch 692d68ed : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:07:53 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[211137]: 01/12/2025 10:07:53 : epoch 692d68ed : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:07:53 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[211137]: 01/12/2025 10:07:53 : epoch 692d68ed : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Dec  1 05:07:53 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[211137]: 01/12/2025 10:07:53 : epoch 692d68ed : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:07:53 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[211137]: 01/12/2025 10:07:53 : epoch 692d68ed : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Dec  1 05:07:53 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[211137]: 01/12/2025 10:07:53 : epoch 692d68ed : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Dec  1 05:07:53 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[211137]: 01/12/2025 10:07:53 : epoch 692d68ed : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Dec  1 05:07:53 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[211137]: 01/12/2025 10:07:53 : epoch 692d68ed : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Dec  1 05:07:53 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[211137]: 01/12/2025 10:07:53 : epoch 692d68ed : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Dec  1 05:07:53 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[211137]: 01/12/2025 10:07:53 : epoch 692d68ed : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Dec  1 05:07:53 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[211137]: 01/12/2025 10:07:53 : epoch 692d68ed : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Dec  1 05:07:53 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[211137]: 01/12/2025 10:07:53 : epoch 692d68ed : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Dec  1 05:07:53 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[211137]: 01/12/2025 10:07:53 : epoch 692d68ed : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Dec  1 05:07:53 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[211137]: 01/12/2025 10:07:53 : epoch 692d68ed : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Dec  1 05:07:53 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[211137]: 01/12/2025 10:07:53 : epoch 692d68ed : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Dec  1 05:07:53 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[211137]: 01/12/2025 10:07:53 : epoch 692d68ed : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Dec  1 05:07:53 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[211137]: 01/12/2025 10:07:53 : epoch 692d68ed : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  1 05:07:53 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[211137]: 01/12/2025 10:07:53 : epoch 692d68ed : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Dec  1 05:07:53 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[211137]: 01/12/2025 10:07:53 : epoch 692d68ed : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  1 05:07:53 np0005540826 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 05:07:53 np0005540826 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 05:07:53 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:07:53 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:07:53 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:07:53.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:07:54 np0005540826 systemd[1]: Queuing reload/restart jobs for marked units…
Dec  1 05:07:54 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:07:54 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:07:54 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:07:54.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:07:54 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[211137]: 01/12/2025 10:07:54 : epoch 692d68ed : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f98b0000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:55 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[211137]: 01/12/2025 10:07:55 : epoch 692d68ed : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f98a8001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:55 np0005540826 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec  1 05:07:55 np0005540826 systemd[1]: Finished man-db-cache-update.service.
Dec  1 05:07:55 np0005540826 systemd[1]: man-db-cache-update.service: Consumed 1.689s CPU time.
Dec  1 05:07:55 np0005540826 systemd[1]: run-r0eba276f57624c34831b4b4db81bb4e0.service: Deactivated successfully.
Dec  1 05:07:55 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[211137]: 01/12/2025 10:07:55 : epoch 692d68ed : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9884000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:55 np0005540826 python3.9[214123]: ansible-ansible.builtin.systemd_service Invoked with name=iscsid state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  1 05:07:55 np0005540826 systemd[1]: Stopping Open-iSCSI...
Dec  1 05:07:55 np0005540826 iscsid[202069]: iscsid shutting down.
Dec  1 05:07:55 np0005540826 systemd[1]: iscsid.service: Deactivated successfully.
Dec  1 05:07:55 np0005540826 systemd[1]: Stopped Open-iSCSI.
Dec  1 05:07:55 np0005540826 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Dec  1 05:07:55 np0005540826 systemd[1]: Starting Open-iSCSI...
Dec  1 05:07:55 np0005540826 systemd[1]: Started Open-iSCSI.
Dec  1 05:07:55 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:07:55 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:07:55 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:07:55.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:07:56 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:07:56 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:07:56 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:07:56 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:07:56.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:07:56 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[211137]: 01/12/2025 10:07:56 : epoch 692d68ed : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9880000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:56 np0005540826 python3.9[214279]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 05:07:57 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[211137]: 01/12/2025 10:07:57 : epoch 692d68ed : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f988c000fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:57 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis[85207]: [WARNING] 334/100757 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  1 05:07:57 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[211137]: 01/12/2025 10:07:57 : epoch 692d68ed : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f98a80025c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:57 np0005540826 python3.9[214435]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:07:57 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:07:57 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:07:57 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:07:57.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:07:58 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:07:58 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:07:58 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:07:58.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:07:58 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[211137]: 01/12/2025 10:07:58 : epoch 692d68ed : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f98840016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:58 np0005540826 podman[214560]: 2025-12-01 10:07:58.801024947 +0000 UTC m=+0.105553346 container health_status 6b06bbb7dde72e1297fe084ecc5611549b12415c8a2a7d771b9728c6924dbf55 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  1 05:07:59 np0005540826 python3.9[214606]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec  1 05:07:59 np0005540826 systemd[1]: Reloading.
Dec  1 05:07:59 np0005540826 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 05:07:59 np0005540826 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 05:07:59 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[211137]: 01/12/2025 10:07:59 : epoch 692d68ed : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f98a80025c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:59 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[211137]: 01/12/2025 10:07:59 : epoch 692d68ed : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f98800016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:59 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:07:59 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:07:59 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:07:59.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:08:00 np0005540826 python3.9[214799]: ansible-ansible.builtin.service_facts Invoked
Dec  1 05:08:00 np0005540826 network[214817]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec  1 05:08:00 np0005540826 network[214818]: 'network-scripts' will be removed from distribution in near future.
Dec  1 05:08:00 np0005540826 network[214819]: It is advised to switch to 'NetworkManager' instead for network management.
Dec  1 05:08:00 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:08:00 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:08:00 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:08:00.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:08:00 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[211137]: 01/12/2025 10:08:00 : epoch 692d68ed : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f988c001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:01 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:08:01 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[211137]: 01/12/2025 10:08:01 : epoch 692d68ed : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f98840016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:01 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[211137]: 01/12/2025 10:08:01 : epoch 692d68ed : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f98a80025c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:01 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:08:01 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:08:01 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:08:01.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:08:02 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:08:02 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:08:02 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:08:02 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:08:02 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:08:02 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:08:02 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:08:02.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:08:02 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[211137]: 01/12/2025 10:08:02 : epoch 692d68ed : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f98800016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:03 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 05:08:03 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:08:03 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:08:03 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 05:08:03 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[211137]: 01/12/2025 10:08:03 : epoch 692d68ed : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f98840016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:03 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[211137]: 01/12/2025 10:08:03 : epoch 692d68ed : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f98a80025c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:03 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:08:03 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:08:03 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:08:03.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:08:04 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:08:04 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:08:04 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:08:04.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:08:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:08:04.539 141685 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:08:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:08:04.544 141685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.005s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:08:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:08:04.544 141685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:08:04 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[211137]: 01/12/2025 10:08:04 : epoch 692d68ed : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f988c0023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:05 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[211137]: 01/12/2025 10:08:05 : epoch 692d68ed : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f98800016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:05 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[211137]: 01/12/2025 10:08:05 : epoch 692d68ed : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f98a80025c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:05 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:08:05 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:08:05 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:08:05.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:08:06 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:08:06 np0005540826 python3.9[215272]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 05:08:06 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:08:06 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:08:06 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:08:06.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:08:06 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[211137]: 01/12/2025 10:08:06 : epoch 692d68ed : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9884002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:07 np0005540826 python3.9[215426]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 05:08:07 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[211137]: 01/12/2025 10:08:07 : epoch 692d68ed : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f988c002580 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:07 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[211137]: 01/12/2025 10:08:07 : epoch 692d68ed : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9880002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:07 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:08:07 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:08:07 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:08:07.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:08:07 np0005540826 python3.9[215579]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 05:08:08 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:08:08 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:08:08 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:08:08.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:08:08 np0005540826 python3.9[215758]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 05:08:08 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:08:08 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:08:08 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[211137]: 01/12/2025 10:08:08 : epoch 692d68ed : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f98a8003ea0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:08 np0005540826 podman[215760]: 2025-12-01 10:08:08.786659931 +0000 UTC m=+0.068094249 container health_status 2f6a34a2eb1139886d5df897b4264e2aa17883bd121a8757ac91426f6d44356f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec  1 05:08:09 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[211137]: 01/12/2025 10:08:09 : epoch 692d68ed : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9884002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:09 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis[85207]: [WARNING] 334/100809 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  1 05:08:09 np0005540826 python3.9[215929]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 05:08:09 np0005540826 kernel: ganesha.nfsd[212797]: segfault at 50 ip 00007f995b45032e sp 00007f9925ffa210 error 4 in libntirpc.so.5.8[7f995b435000+2c000] likely on CPU 7 (core 0, socket 7)
Dec  1 05:08:09 np0005540826 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Dec  1 05:08:09 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[211137]: 01/12/2025 10:08:09 : epoch 692d68ed : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f988c003340 fd 38 proxy ignored for local
Dec  1 05:08:09 np0005540826 systemd[1]: Started Process Core Dump (PID 215949/UID 0).
Dec  1 05:08:09 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:08:09 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:08:09 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:08:09.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:08:10 np0005540826 python3.9[216084]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 05:08:10 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:08:10 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:08:10 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:08:10.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:08:10 np0005540826 systemd-coredump[215956]: Process 211142 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 44:#012#0  0x00007f995b45032e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Dec  1 05:08:11 np0005540826 systemd[1]: systemd-coredump@8-215949-0.service: Deactivated successfully.
Dec  1 05:08:11 np0005540826 systemd[1]: systemd-coredump@8-215949-0.service: Consumed 1.336s CPU time.
Dec  1 05:08:11 np0005540826 python3.9[216238]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 05:08:11 np0005540826 podman[216243]: 2025-12-01 10:08:11.074662653 +0000 UTC m=+0.031993658 container died f0f91343ccffed1c19232c65117ccfcfea511a270d1f429d5ffaff9654169fad (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 05:08:11 np0005540826 systemd[1]: var-lib-containers-storage-overlay-dab35661d023c3ff53e8f7f11947cf9bddc11449e3b5caf4dab75ef1e99ae8ce-merged.mount: Deactivated successfully.
Dec  1 05:08:11 np0005540826 podman[216243]: 2025-12-01 10:08:11.118204304 +0000 UTC m=+0.075535289 container remove f0f91343ccffed1c19232c65117ccfcfea511a270d1f429d5ffaff9654169fad (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, io.buildah.version=1.40.1, org.label-schema.license=GPLv2)
Dec  1 05:08:11 np0005540826 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.0.0.compute-1.osfnzc.service: Main process exited, code=exited, status=139/n/a
Dec  1 05:08:11 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:08:11 np0005540826 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.0.0.compute-1.osfnzc.service: Failed with result 'exit-code'.
Dec  1 05:08:11 np0005540826 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.0.0.compute-1.osfnzc.service: Consumed 1.593s CPU time.
Dec  1 05:08:11 np0005540826 python3.9[216439]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 05:08:11 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:08:11 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:08:11 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:08:11.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:08:12 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:08:12 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:08:12 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:08:12.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:08:13 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:08:13 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:08:13 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:08:13.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:08:13 np0005540826 podman[216466]: 2025-12-01 10:08:13.989208118 +0000 UTC m=+0.073072856 container health_status 9ce59c2758d021c154e97f8a3178b73d8f43045c59b9ba6fd93369a760a49136 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec  1 05:08:14 np0005540826 python3.9[216614]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:08:14 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:08:14 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:08:14 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:08:14.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:08:15 np0005540826 python3.9[216766]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:08:15 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis[85207]: [WARNING] 334/100815 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  1 05:08:15 np0005540826 python3.9[216918]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:08:15 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:08:15 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:08:15 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:08:15.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:08:16 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:08:16 np0005540826 python3.9[217071]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:08:16 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:08:16 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:08:16 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:08:16.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:08:16 np0005540826 python3.9[217223]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:08:17 np0005540826 python3.9[217375]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:08:17 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:08:17 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:08:17 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:08:17.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:08:18 np0005540826 python3.9[217527]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:08:18 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:08:18 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:08:18 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:08:18.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:08:19 np0005540826 python3.9[217680]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:08:19 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:08:19 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:08:19 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:08:19.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:08:20 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:08:20 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:08:20 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:08:20.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:08:20 np0005540826 python3.9[217833]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:08:21 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:08:21 np0005540826 python3.9[218010]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:08:21 np0005540826 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.0.0.compute-1.osfnzc.service: Scheduled restart job, restart counter is at 9.
Dec  1 05:08:21 np0005540826 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.osfnzc for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 05:08:21 np0005540826 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.0.0.compute-1.osfnzc.service: Consumed 1.593s CPU time.
Dec  1 05:08:21 np0005540826 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.osfnzc for 365f19c2-81e5-5edd-b6b4-280555214d3a...
Dec  1 05:08:21 np0005540826 podman[218141]: 2025-12-01 10:08:21.615748708 +0000 UTC m=+0.042694031 container create d9b417083696649e5f738a4bf9afe655ab9a3ed709a1ab75b9b8022d135049b5 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default)
Dec  1 05:08:21 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6f89c4fc98e4b5cea678a1e123388c7134f645c2ef19bfd26433be98a5b7eb5c/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Dec  1 05:08:21 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6f89c4fc98e4b5cea678a1e123388c7134f645c2ef19bfd26433be98a5b7eb5c/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 05:08:21 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6f89c4fc98e4b5cea678a1e123388c7134f645c2ef19bfd26433be98a5b7eb5c/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 05:08:21 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6f89c4fc98e4b5cea678a1e123388c7134f645c2ef19bfd26433be98a5b7eb5c/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.osfnzc-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 05:08:21 np0005540826 podman[218141]: 2025-12-01 10:08:21.675952885 +0000 UTC m=+0.102898228 container init d9b417083696649e5f738a4bf9afe655ab9a3ed709a1ab75b9b8022d135049b5 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc, io.buildah.version=1.40.1, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, ceph=True, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec  1 05:08:21 np0005540826 podman[218141]: 2025-12-01 10:08:21.682869101 +0000 UTC m=+0.109814424 container start d9b417083696649e5f738a4bf9afe655ab9a3ed709a1ab75b9b8022d135049b5 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec  1 05:08:21 np0005540826 bash[218141]: d9b417083696649e5f738a4bf9afe655ab9a3ed709a1ab75b9b8022d135049b5
Dec  1 05:08:21 np0005540826 podman[218141]: 2025-12-01 10:08:21.594907546 +0000 UTC m=+0.021852889 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 05:08:21 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:08:21 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Dec  1 05:08:21 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:08:21 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Dec  1 05:08:21 np0005540826 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.osfnzc for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 05:08:21 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:08:21 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Dec  1 05:08:21 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:08:21 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Dec  1 05:08:21 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:08:21 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Dec  1 05:08:21 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:08:21 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Dec  1 05:08:21 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:08:21 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Dec  1 05:08:21 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:08:21 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  1 05:08:21 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:08:21 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:08:21 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:08:21.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:08:21 np0005540826 python3.9[218245]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:08:22 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:08:22 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:08:22 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:08:22.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:08:22 np0005540826 python3.9[218420]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:08:23 np0005540826 python3.9[218572]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:08:23 np0005540826 python3.9[218724]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:08:23 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:08:23 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:08:23 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:08:23.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:08:24 np0005540826 python3.9[218877]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:08:24 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:08:24 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:08:24 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:08:24.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:08:24 np0005540826 python3.9[219029]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:08:25 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:08:25 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:08:25 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:08:25.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:08:26 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:08:26 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:08:26 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:08:26 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:08:26.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:08:26 np0005540826 ceph-osd[77525]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec  1 05:08:26 np0005540826 ceph-osd[77525]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1200.0 total, 600.0 interval#012Cumulative writes: 7887 writes, 31K keys, 7887 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.02 MB/s#012Cumulative WAL: 7887 writes, 1549 syncs, 5.09 writes per sync, written: 0.02 GB, 0.02 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 761 writes, 1336 keys, 761 commit groups, 1.0 writes per commit group, ingest: 0.56 MB, 0.00 MB/s#012Interval WAL: 761 writes, 374 syncs, 2.03 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0#012 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.0 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55e0b36fd350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.5e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-0] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.0 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55e0b36fd350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.5e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-1] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.0 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable
Dec  1 05:08:26 np0005540826 python3.9[219182]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 05:08:27 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:08:27 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[main] rados_kv_traverse :CLIENT ID :EVENT :Failed to lst kv ret=-2
Dec  1 05:08:27 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:08:27 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[main] rados_cluster_read_clids :CLIENT ID :EVENT :Failed to traverse recovery db: -2
Dec  1 05:08:27 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:08:27 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  1 05:08:27 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:08:27 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 05:08:27 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:08:27 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:08:27 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:08:27.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:08:27 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:08:27 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  1 05:08:27 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:08:27 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  1 05:08:27 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:08:27 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  1 05:08:27 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:08:27 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 05:08:27 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:08:27 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  1 05:08:27 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:08:27 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  1 05:08:27 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:08:27 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  1 05:08:27 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:08:27 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 05:08:27 np0005540826 python3.9[219334]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec  1 05:08:28 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:08:28 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:08:28 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:08:28.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:08:28 np0005540826 python3.9[219487]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec  1 05:08:28 np0005540826 systemd[1]: Reloading.
Dec  1 05:08:29 np0005540826 podman[219488]: 2025-12-01 10:08:29.024320982 +0000 UTC m=+0.105911765 container health_status 6b06bbb7dde72e1297fe084ecc5611549b12415c8a2a7d771b9728c6924dbf55 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec  1 05:08:29 np0005540826 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 05:08:29 np0005540826 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 05:08:29 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:08:29 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:08:29 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:08:29.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:08:30 np0005540826 python3.9[219700]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 05:08:30 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:08:30 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:08:30 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:08:30.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:08:30 np0005540826 python3.9[219856]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 05:08:31 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:08:31 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis[85207]: [WARNING] 334/100831 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  1 05:08:31 np0005540826 python3.9[220009]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 05:08:31 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:08:31 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:08:31 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:08:31.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:08:32 np0005540826 python3.9[220162]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 05:08:32 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:08:32 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:08:32 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:08:32.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:08:32 np0005540826 python3.9[220316]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 05:08:33 np0005540826 python3.9[220469]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 05:08:33 np0005540826 python3.9[220622]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 05:08:33 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:08:33 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:08:33 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:08:33.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:08:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:08:34 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[main] rados_cluster_end_grace :CLIENT ID :EVENT :Failed to remove rec-000000000000001b:nfs.cephfs.0: -2
Dec  1 05:08:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:08:34 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  1 05:08:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:08:34 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Dec  1 05:08:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:08:34 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Dec  1 05:08:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:08:34 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Dec  1 05:08:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:08:34 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Dec  1 05:08:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:08:34 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Dec  1 05:08:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:08:34 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Dec  1 05:08:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:08:34 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:08:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:08:34 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:08:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:08:34 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:08:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:08:34 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Dec  1 05:08:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:08:34 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:08:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:08:34 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Dec  1 05:08:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:08:34 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Dec  1 05:08:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:08:34 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Dec  1 05:08:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:08:34 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Dec  1 05:08:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:08:34 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Dec  1 05:08:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:08:34 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Dec  1 05:08:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:08:34 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Dec  1 05:08:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:08:34 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Dec  1 05:08:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:08:34 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Dec  1 05:08:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:08:34 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Dec  1 05:08:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:08:34 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Dec  1 05:08:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:08:34 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Dec  1 05:08:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:08:34 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  1 05:08:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:08:34 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Dec  1 05:08:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:08:34 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  1 05:08:34 np0005540826 python3.9[220789]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 05:08:34 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:08:34 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:08:34 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:08:34.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:08:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:08:34 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7950000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:35 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:08:35 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f794c001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:35 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:08:35 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7928000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:35 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:08:35 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:08:35 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:08:35.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:08:36 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:08:36 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:08:36 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:08:36 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:08:36.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:08:36 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:08:36 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7920000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:37 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:08:37 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7944001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:37 np0005540826 python3.9[220946]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 05:08:37 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis[85207]: [WARNING] 334/100837 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  1 05:08:37 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:08:37 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f794c0025c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:37 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:08:37 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:08:37 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:08:37.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:08:38 np0005540826 python3.9[221098]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 05:08:38 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:08:38 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:08:38 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:08:38.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:08:38 np0005540826 python3.9[221251]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 05:08:38 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:08:38 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7920000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:38 np0005540826 podman[221276]: 2025-12-01 10:08:38.969175986 +0000 UTC m=+0.051305411 container health_status 2f6a34a2eb1139886d5df897b4264e2aa17883bd121a8757ac91426f6d44356f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec  1 05:08:39 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:08:39 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f79280016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:39 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:08:39 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7944002520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:39 np0005540826 python3.9[221423]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 05:08:39 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:08:39 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:08:39 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:08:39.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:08:40 np0005540826 python3.9[221576]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 05:08:40 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:08:40 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:08:40 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:08:40.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:08:40 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:08:40 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f794c0025c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:40 np0005540826 python3.9[221728]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 05:08:41 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:08:41 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:08:41 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f794c0025c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:41 np0005540826 python3.9[221905]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 05:08:41 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:08:41 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7928001fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:41 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:08:41 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:08:41 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:08:41.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:08:42 np0005540826 python3.9[222057]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec  1 05:08:42 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:08:42 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:08:42 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:08:42.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:08:42 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:08:42 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f794c0025c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:42 np0005540826 python3.9[222210]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec  1 05:08:43 np0005540826 python3.9[222362]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec  1 05:08:43 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:08:43 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7944002520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:43 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:08:43 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7920001b20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:43 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:08:43 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:08:43 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:08:43.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:08:44 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:08:44 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:08:44 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:08:44.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:08:44 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:08:44 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7928001fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:44 np0005540826 podman[222388]: 2025-12-01 10:08:44.979864745 +0000 UTC m=+0.056166168 container health_status 9ce59c2758d021c154e97f8a3178b73d8f43045c59b9ba6fd93369a760a49136 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec  1 05:08:45 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:08:45 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f794c003ab0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:45 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:08:45 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f794c003ab0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:45 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:08:45 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:08:45 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:08:45.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:08:46 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:08:46 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:08:46 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:08:46 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:08:46.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:08:46 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:08:46 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7944002520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:08:47 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7944002520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:08:47 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7944002520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:47 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:08:47 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:08:47 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:08:47.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:08:48 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:08:48 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:08:48 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:08:48.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:08:48 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:08:48 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7928001fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:49 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:08:49 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f794c003ab0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:49 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:08:49 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f794c003ab0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:49 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:08:49 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:08:49 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:08:49.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:08:49 np0005540826 python3.9[222537]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Dec  1 05:08:50 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:08:50 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:08:50 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:08:50.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:08:50 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:08:50 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7944002520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:50 np0005540826 python3.9[222691]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec  1 05:08:51 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:08:51 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:08:51 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7944002520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:51 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:08:51 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7944002520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:51 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:08:51 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:08:51 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:08:51.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:08:52 np0005540826 python3.9[222849]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec  1 05:08:52 np0005540826 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  1 05:08:52 np0005540826 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  1 05:08:52 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:08:52 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000022s ======
Dec  1 05:08:52 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:08:52.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec  1 05:08:52 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis[85207]: [WARNING] 334/100852 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  1 05:08:52 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:08:52 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f79280032f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:53 np0005540826 systemd-logind[787]: New session 54 of user zuul.
Dec  1 05:08:53 np0005540826 systemd[1]: Started Session 54 of User zuul.
Dec  1 05:08:53 np0005540826 systemd[1]: session-54.scope: Deactivated successfully.
Dec  1 05:08:53 np0005540826 systemd-logind[787]: Session 54 logged out. Waiting for processes to exit.
Dec  1 05:08:53 np0005540826 systemd-logind[787]: Removed session 54.
Dec  1 05:08:53 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:08:53 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f794c003ab0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:53 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:08:53 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7920002720 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:53 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:08:53 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:08:53 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:08:53.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:08:54 np0005540826 python3.9[223037]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:08:54 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:08:54 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:08:54 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:08:54.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:08:54 np0005540826 python3.9[223159]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764583733.5534978-3434-234096235154848/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  1 05:08:54 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:08:54 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7920002720 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:55 np0005540826 python3.9[223309]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:08:55 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:08:55 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f79280032f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:55 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:08:55 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f794c003ab0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:55 np0005540826 python3.9[223385]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  1 05:08:55 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:08:55 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:08:55 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:08:55.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:08:56 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:08:56 np0005540826 python3.9[223535]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:08:56 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:08:56 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000022s ======
Dec  1 05:08:56 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:08:56.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec  1 05:08:56 np0005540826 python3.9[223657]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764583735.765744-3434-100895522773385/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  1 05:08:56 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:08:56 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7944002520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:57 np0005540826 ceph-osd[77525]: bluestore.MempoolThread fragmentation_score=0.000026 took=0.000049s
Dec  1 05:08:57 np0005540826 python3.9[223807]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:08:57 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:08:57 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7920002720 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:57 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:08:57 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7928004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:57 np0005540826 python3.9[223928]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764583736.9027011-3434-175956608776608/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=bc7f3bb7d4094c596a18178a888511b54e157ba4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  1 05:08:57 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:08:57 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:08:57 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:08:57.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:08:58 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:08:58 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000022s ======
Dec  1 05:08:58 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:08:58.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec  1 05:08:58 np0005540826 python3.9[224079]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:08:58 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:08:58 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f794c003ab0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:59 np0005540826 python3.9[224200]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764583738.2600121-3434-168649658460478/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  1 05:08:59 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:08:59 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7944002520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:59 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:08:59 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7920003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:59 np0005540826 podman[224324]: 2025-12-01 10:08:59.730143401 +0000 UTC m=+0.089870330 container health_status 6b06bbb7dde72e1297fe084ecc5611549b12415c8a2a7d771b9728c6924dbf55 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Dec  1 05:08:59 np0005540826 python3.9[224363]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:08:59 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:08:59 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000022s ======
Dec  1 05:08:59 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:08:59.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec  1 05:09:00 np0005540826 python3.9[224498]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764583739.396185-3434-128871807682571/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  1 05:09:00 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:00 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  1 05:09:00 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:09:00 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:09:00 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:09:00.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:09:00 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:00 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7928004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:01 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:09:01 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:01 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f794c003ab0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:01 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:01 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7944002520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:01 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:09:01 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000022s ======
Dec  1 05:09:01 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:09:01.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec  1 05:09:02 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:09:02 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:09:02 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:09:02.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:09:02 np0005540826 python3.9[224676]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:09:02 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:02 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7920003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:03 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:03 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7928004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:03 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:03 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  1 05:09:03 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:03 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 05:09:03 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:03 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 05:09:03 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:03 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f794c0043d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:03 np0005540826 python3.9[224828]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:09:03 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:09:03 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000022s ======
Dec  1 05:09:03 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:09:03.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec  1 05:09:04 np0005540826 python3.9[224981]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 05:09:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:09:04.540 141685 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:09:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:09:04.540 141685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:09:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:09:04.541 141685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:09:04 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:09:04 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:09:04 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:09:04.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:09:04 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:04 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7944002520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:05 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:05 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7920003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:05 np0005540826 python3.9[225133]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:09:05 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:05 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7928004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:05 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:09:05 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000022s ======
Dec  1 05:09:05 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:09:05.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec  1 05:09:05 np0005540826 python3.9[225256]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1764583744.944266-3755-156709826427916/.source _original_basename=.sxdbxfo9 follow=False checksum=9f321370b7f95ad3b56102ad7a7be35e8ed6914a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Dec  1 05:09:06 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:09:06 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:06 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  1 05:09:06 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:09:06 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000022s ======
Dec  1 05:09:06 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:09:06.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec  1 05:09:06 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:06 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f794c0043d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:07 np0005540826 python3.9[225412]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 05:09:07 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:07 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7944002520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:07 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:07 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f791c000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:07 np0005540826 python3.9[225564]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:09:07 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:09:07 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:09:07 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:09:07.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:09:08 np0005540826 python3.9[225736]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764583747.4398968-3834-269745316546955/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=211ffd0bca4b407eb4de45a749ef70116a7806fd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  1 05:09:08 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:09:08 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000022s ======
Dec  1 05:09:08 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:09:08.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec  1 05:09:08 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:08 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7920003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:09 np0005540826 podman[225891]: 2025-12-01 10:09:09.251976837 +0000 UTC m=+0.058741655 container health_status 2f6a34a2eb1139886d5df897b4264e2aa17883bd121a8757ac91426f6d44356f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Dec  1 05:09:09 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:09 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f794c0043d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:09 np0005540826 python3.9[225930]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:09:09 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:09:09 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:09:09 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 05:09:09 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:09:09 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:09:09 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 05:09:09 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:09 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7944002520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:09 np0005540826 python3.9[226056]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764583748.7702444-3879-51006940728855/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=60b024e6db49dc6e700fc0d50263944d98d4c034 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  1 05:09:09 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:09:09 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:09:09 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:09:09.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:09:10 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:09:10 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000022s ======
Dec  1 05:09:10 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:09:10.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec  1 05:09:10 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:10 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f791c0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:11 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:09:11 np0005540826 python3.9[226209]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Dec  1 05:09:11 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:11 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7920003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:11 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:11 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f794c0043d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:11 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:09:11 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:09:11 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:09:11.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:09:11 np0005540826 python3.9[226361]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec  1 05:09:12 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis[85207]: [WARNING] 334/100912 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  1 05:09:12 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:09:12 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000022s ======
Dec  1 05:09:12 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:09:12.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec  1 05:09:12 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:12 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7944002520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:13 np0005540826 python3[226514]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json log_base_path=/var/log/containers/stdouts debug=False
Dec  1 05:09:13 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:13 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f791c0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:13 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:13 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f791c0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:13 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:09:13 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:09:13 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:09:13.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:09:14 np0005540826 ceph-mon[80026]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #37. Immutable memtables: 0.
Dec  1 05:09:14 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:09:14.571866) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  1 05:09:14 np0005540826 ceph-mon[80026]: rocksdb: [db/flush_job.cc:856] [default] [JOB 19] Flushing memtable with next log file: 37
Dec  1 05:09:14 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583754571906, "job": 19, "event": "flush_started", "num_memtables": 1, "num_entries": 1431, "num_deletes": 251, "total_data_size": 3685466, "memory_usage": 3734344, "flush_reason": "Manual Compaction"}
Dec  1 05:09:14 np0005540826 ceph-mon[80026]: rocksdb: [db/flush_job.cc:885] [default] [JOB 19] Level-0 flush table #38: started
Dec  1 05:09:14 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583754584601, "cf_name": "default", "job": 19, "event": "table_file_creation", "file_number": 38, "file_size": 2398176, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 19285, "largest_seqno": 20711, "table_properties": {"data_size": 2391991, "index_size": 3448, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1669, "raw_key_size": 12805, "raw_average_key_size": 19, "raw_value_size": 2379753, "raw_average_value_size": 3683, "num_data_blocks": 152, "num_entries": 646, "num_filter_entries": 646, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764583635, "oldest_key_time": 1764583635, "file_creation_time": 1764583754, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d6e1f97e-eb58-41c1-b758-cb672eabd75e", "db_session_id": "KSHNHU57VR1LHZ6EZUM4", "orig_file_number": 38, "seqno_to_time_mapping": "N/A"}}
Dec  1 05:09:14 np0005540826 ceph-mon[80026]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 19] Flush lasted 12773 microseconds, and 5391 cpu microseconds.
Dec  1 05:09:14 np0005540826 ceph-mon[80026]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 05:09:14 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:09:14.584640) [db/flush_job.cc:967] [default] [JOB 19] Level-0 flush table #38: 2398176 bytes OK
Dec  1 05:09:14 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:09:14.584655) [db/memtable_list.cc:519] [default] Level-0 commit table #38 started
Dec  1 05:09:14 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:09:14.585791) [db/memtable_list.cc:722] [default] Level-0 commit table #38: memtable #1 done
Dec  1 05:09:14 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:09:14.585806) EVENT_LOG_v1 {"time_micros": 1764583754585802, "job": 19, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  1 05:09:14 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:09:14.585825) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  1 05:09:14 np0005540826 ceph-mon[80026]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 19] Try to delete WAL files size 3678774, prev total WAL file size 3714399, number of live WAL files 2.
Dec  1 05:09:14 np0005540826 ceph-mon[80026]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000034.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:09:14 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:09:14.588892) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031323535' seq:72057594037927935, type:22 .. '7061786F730031353037' seq:0, type:0; will stop at (end)
Dec  1 05:09:14 np0005540826 ceph-mon[80026]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 20] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  1 05:09:14 np0005540826 ceph-mon[80026]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 19 Base level 0, inputs: [38(2341KB)], [36(12MB)]
Dec  1 05:09:14 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583754588922, "job": 20, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [38], "files_L6": [36], "score": -1, "input_data_size": 15809557, "oldest_snapshot_seqno": -1}
Dec  1 05:09:14 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:09:14 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000022s ======
Dec  1 05:09:14 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:09:14.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec  1 05:09:14 np0005540826 ceph-mon[80026]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 20] Generated table #39: 5033 keys, 13611256 bytes, temperature: kUnknown
Dec  1 05:09:14 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583754658503, "cf_name": "default", "job": 20, "event": "table_file_creation", "file_number": 39, "file_size": 13611256, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13576317, "index_size": 21261, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12613, "raw_key_size": 128203, "raw_average_key_size": 25, "raw_value_size": 13483453, "raw_average_value_size": 2679, "num_data_blocks": 874, "num_entries": 5033, "num_filter_entries": 5033, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582559, "oldest_key_time": 0, "file_creation_time": 1764583754, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d6e1f97e-eb58-41c1-b758-cb672eabd75e", "db_session_id": "KSHNHU57VR1LHZ6EZUM4", "orig_file_number": 39, "seqno_to_time_mapping": "N/A"}}
Dec  1 05:09:14 np0005540826 ceph-mon[80026]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 05:09:14 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:09:14.658775) [db/compaction/compaction_job.cc:1663] [default] [JOB 20] Compacted 1@0 + 1@6 files to L6 => 13611256 bytes
Dec  1 05:09:14 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:09:14.660144) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 226.9 rd, 195.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.3, 12.8 +0.0 blob) out(13.0 +0.0 blob), read-write-amplify(12.3) write-amplify(5.7) OK, records in: 5553, records dropped: 520 output_compression: NoCompression
Dec  1 05:09:14 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:09:14.660166) EVENT_LOG_v1 {"time_micros": 1764583754660154, "job": 20, "event": "compaction_finished", "compaction_time_micros": 69668, "compaction_time_cpu_micros": 26164, "output_level": 6, "num_output_files": 1, "total_output_size": 13611256, "num_input_records": 5553, "num_output_records": 5033, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  1 05:09:14 np0005540826 ceph-mon[80026]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000038.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:09:14 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583754660646, "job": 20, "event": "table_file_deletion", "file_number": 38}
Dec  1 05:09:14 np0005540826 ceph-mon[80026]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000036.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:09:14 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583754663847, "job": 20, "event": "table_file_deletion", "file_number": 36}
Dec  1 05:09:14 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:09:14.588830) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:09:14 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:09:14.663894) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:09:14 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:09:14.663899) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:09:14 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:09:14.663900) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:09:14 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:09:14.663902) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:09:14 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:09:14.663904) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:09:14 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:09:14 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:14 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f794c0043d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:15 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:15 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7944002520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:15 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:15 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7920003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:15 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:09:15 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:09:15 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:09:15 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:09:15.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:09:16 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:09:16 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:09:16 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000022s ======
Dec  1 05:09:16 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:09:16.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec  1 05:09:16 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:16 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f791c0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:17 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:17 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f794c0043d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:17 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:17 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7944002520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:17 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:09:17 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:09:17 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:09:17.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:09:18 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:09:18 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:09:18 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:09:18.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:09:18 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:18 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7920003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:19 np0005540826 podman[226591]: 2025-12-01 10:09:19.393218424 +0000 UTC m=+3.762283579 container health_status 9ce59c2758d021c154e97f8a3178b73d8f43045c59b9ba6fd93369a760a49136 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=multipathd, org.label-schema.license=GPLv2)
Dec  1 05:09:19 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:19 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f791c002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:19 np0005540826 ceph-mon[80026]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec  1 05:09:19 np0005540826 ceph-mon[80026]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1200.0 total, 600.0 interval#012Cumulative writes: 3639 writes, 20K keys, 3639 commit groups, 1.0 writes per commit group, ingest: 0.05 GB, 0.05 MB/s#012Cumulative WAL: 3639 writes, 3639 syncs, 1.00 writes per sync, written: 0.05 GB, 0.05 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1435 writes, 6943 keys, 1435 commit groups, 1.0 writes per commit group, ingest: 16.87 MB, 0.03 MB/s#012Interval WAL: 1435 writes, 1435 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0    131.3      0.23              0.09        10    0.023       0      0       0.0       0.0#012  L6      1/0   12.98 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   3.7    188.4    163.1      0.71              0.31         9    0.079     44K   4683       0.0       0.0#012 Sum      1/0   12.98 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   4.7    141.4    155.2      0.94              0.40        19    0.050     44K   4683       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.0       0.1      0.0       0.0   6.6    153.8    153.3      0.39              0.14         8    0.048     22K   2385       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) 
Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   0.0    188.4    163.1      0.71              0.31         9    0.079     44K   4683       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0    132.3      0.23              0.09         9    0.026       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.0 total, 600.0 interval#012Flush(GB): cumulative 0.030, interval 0.009#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.14 GB write, 0.12 MB/s write, 0.13 GB read, 0.11 MB/s read, 0.9 seconds#012Interval compaction: 0.06 GB write, 0.10 MB/s write, 0.06 GB read, 0.10 MB/s read, 0.4 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55d4317d9350#2 capacity: 304.00 MB usage: 7.55 MB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 0 last_secs: 0.000113 secs_since: 0#012Block cache entry stats(count,size,portion): 
DataBlock(407,7.18 MB,2.36336%) FilterBlock(19,131.05 KB,0.0420972%) IndexBlock(19,245.52 KB,0.0788689%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Dec  1 05:09:19 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:19 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f794c0043d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:19 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:09:19 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:09:19 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:09:19.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:09:20 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:09:20 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000022s ======
Dec  1 05:09:20 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:09:20.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec  1 05:09:20 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:20 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7944002520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:21 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:09:21 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:21 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7920003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:21 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:21 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f791c002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:21 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:09:21 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000022s ======
Dec  1 05:09:21 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:09:21.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec  1 05:09:22 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:09:22 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:09:22 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:09:22.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:09:22 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:22 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f794c0043d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:23 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:23 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7944002520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:23 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:23 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7920003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:23 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:09:23 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:09:23 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:09:23.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:09:24 np0005540826 podman[226527]: 2025-12-01 10:09:24.183277829 +0000 UTC m=+11.004098052 image pull 5571c1b2140c835f70406e4553b3b44135b9c9b4eb673345cbd571460c5d59a3 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Dec  1 05:09:24 np0005540826 podman[226682]: 2025-12-01 10:09:24.337655839 +0000 UTC m=+0.053061059 container create a4dfe4fdcf15ffd724c87d934a9b23b6bc3b1013a2f41906449d7aa6a11f07b7 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.schema-version=1.0, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, container_name=nova_compute_init, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec  1 05:09:24 np0005540826 podman[226682]: 2025-12-01 10:09:24.307583817 +0000 UTC m=+0.022989047 image pull 5571c1b2140c835f70406e4553b3b44135b9c9b4eb673345cbd571460c5d59a3 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Dec  1 05:09:24 np0005540826 python3[226514]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Dec  1 05:09:24 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:09:24 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000022s ======
Dec  1 05:09:24 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:09:24.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec  1 05:09:24 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:24 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f791c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:25 np0005540826 python3.9[226872]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 05:09:25 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:25 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f794c0043d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:25 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:25 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7944002520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:25 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:09:25 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000022s ======
Dec  1 05:09:25 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:09:25.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec  1 05:09:26 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:09:26 np0005540826 python3.9[227027]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Dec  1 05:09:26 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:09:26 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000022s ======
Dec  1 05:09:26 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:09:26.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec  1 05:09:26 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:26 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7920003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:27 np0005540826 python3.9[227179]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec  1 05:09:27 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:27 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f791c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:27 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:27 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f794c0043d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:27 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:09:27 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000022s ======
Dec  1 05:09:27 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:09:27.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec  1 05:09:28 np0005540826 python3[227332]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json log_base_path=/var/log/containers/stdouts debug=False
Dec  1 05:09:28 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:09:28 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000022s ======
Dec  1 05:09:28 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:09:28.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec  1 05:09:28 np0005540826 podman[227370]: 2025-12-01 10:09:28.693632615 +0000 UTC m=+0.049791198 container create cfb7cbc57641f129e88c32410827d833c12a2a569b276caeff50c1efc617ea53 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, container_name=nova_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_id=edpm)
Dec  1 05:09:28 np0005540826 podman[227370]: 2025-12-01 10:09:28.67023336 +0000 UTC m=+0.026391963 image pull 5571c1b2140c835f70406e4553b3b44135b9c9b4eb673345cbd571460c5d59a3 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Dec  1 05:09:28 np0005540826 python3[227332]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume 
/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified kolla_start
Dec  1 05:09:28 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:28 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7944002520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:29 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:29 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7920003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:29 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:29 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f791c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:29 np0005540826 python3.9[227558]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 05:09:29 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:09:29 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000022s ======
Dec  1 05:09:29 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:09:29.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec  1 05:09:30 np0005540826 podman[227561]: 2025-12-01 10:09:30.014197168 +0000 UTC m=+0.091852043 container health_status 6b06bbb7dde72e1297fe084ecc5611549b12415c8a2a7d771b9728c6924dbf55 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125)
Dec  1 05:09:30 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:09:30 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000022s ======
Dec  1 05:09:30 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:09:30.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec  1 05:09:30 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:30 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f794c0043d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:31 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:09:31 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:31 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7944002520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:31 np0005540826 python3.9[227739]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:09:31 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:31 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7944002520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:31 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:09:31 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000022s ======
Dec  1 05:09:31 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:09:31.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec  1 05:09:32 np0005540826 python3.9[227890]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764583771.673629-4154-107854977397289/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:09:32 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:09:32 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:09:32 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:09:32.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:09:32 np0005540826 python3.9[227967]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec  1 05:09:32 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:32 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f791c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:32 np0005540826 systemd[1]: Reloading.
Dec  1 05:09:32 np0005540826 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 05:09:32 np0005540826 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 05:09:33 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:33 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f791c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:33 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:33 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7920003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:33 np0005540826 python3.9[228078]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 05:09:33 np0005540826 systemd[1]: Reloading.
Dec  1 05:09:33 np0005540826 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 05:09:33 np0005540826 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 05:09:33 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:09:33 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:09:33 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:09:33.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:09:34 np0005540826 systemd[1]: Starting nova_compute container...
Dec  1 05:09:34 np0005540826 systemd[1]: Started libcrun container.
Dec  1 05:09:34 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/87b9964f8628ce9ad78dbc8a658c7c252617445a57c548122b5037d37739e098/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec  1 05:09:34 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/87b9964f8628ce9ad78dbc8a658c7c252617445a57c548122b5037d37739e098/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Dec  1 05:09:34 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/87b9964f8628ce9ad78dbc8a658c7c252617445a57c548122b5037d37739e098/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec  1 05:09:34 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/87b9964f8628ce9ad78dbc8a658c7c252617445a57c548122b5037d37739e098/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec  1 05:09:34 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/87b9964f8628ce9ad78dbc8a658c7c252617445a57c548122b5037d37739e098/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Dec  1 05:09:34 np0005540826 podman[228118]: 2025-12-01 10:09:34.243568753 +0000 UTC m=+0.116376654 container init cfb7cbc57641f129e88c32410827d833c12a2a569b276caeff50c1efc617ea53 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec  1 05:09:34 np0005540826 podman[228118]: 2025-12-01 10:09:34.250026845 +0000 UTC m=+0.122834716 container start cfb7cbc57641f129e88c32410827d833c12a2a569b276caeff50c1efc617ea53 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, managed_by=edpm_ansible, container_name=nova_compute, org.label-schema.build-date=20251125, io.buildah.version=1.41.3)
Dec  1 05:09:34 np0005540826 podman[228118]: nova_compute
Dec  1 05:09:34 np0005540826 nova_compute[228134]: + sudo -E kolla_set_configs
Dec  1 05:09:34 np0005540826 systemd[1]: Started nova_compute container.
Dec  1 05:09:34 np0005540826 nova_compute[228134]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec  1 05:09:34 np0005540826 nova_compute[228134]: INFO:__main__:Validating config file
Dec  1 05:09:34 np0005540826 nova_compute[228134]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec  1 05:09:34 np0005540826 nova_compute[228134]: INFO:__main__:Copying service configuration files
Dec  1 05:09:34 np0005540826 nova_compute[228134]: INFO:__main__:Deleting /etc/nova/nova.conf
Dec  1 05:09:34 np0005540826 nova_compute[228134]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Dec  1 05:09:34 np0005540826 nova_compute[228134]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Dec  1 05:09:34 np0005540826 nova_compute[228134]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Dec  1 05:09:34 np0005540826 nova_compute[228134]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Dec  1 05:09:34 np0005540826 nova_compute[228134]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Dec  1 05:09:34 np0005540826 nova_compute[228134]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Dec  1 05:09:34 np0005540826 nova_compute[228134]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Dec  1 05:09:34 np0005540826 nova_compute[228134]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Dec  1 05:09:34 np0005540826 nova_compute[228134]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Dec  1 05:09:34 np0005540826 nova_compute[228134]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Dec  1 05:09:34 np0005540826 nova_compute[228134]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec  1 05:09:34 np0005540826 nova_compute[228134]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec  1 05:09:34 np0005540826 nova_compute[228134]: INFO:__main__:Deleting /etc/ceph
Dec  1 05:09:34 np0005540826 nova_compute[228134]: INFO:__main__:Creating directory /etc/ceph
Dec  1 05:09:34 np0005540826 nova_compute[228134]: INFO:__main__:Setting permission for /etc/ceph
Dec  1 05:09:34 np0005540826 nova_compute[228134]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf
Dec  1 05:09:34 np0005540826 nova_compute[228134]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Dec  1 05:09:34 np0005540826 nova_compute[228134]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Dec  1 05:09:34 np0005540826 nova_compute[228134]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Dec  1 05:09:34 np0005540826 nova_compute[228134]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Dec  1 05:09:34 np0005540826 nova_compute[228134]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec  1 05:09:34 np0005540826 nova_compute[228134]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Dec  1 05:09:34 np0005540826 nova_compute[228134]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec  1 05:09:34 np0005540826 nova_compute[228134]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Dec  1 05:09:34 np0005540826 nova_compute[228134]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Dec  1 05:09:34 np0005540826 nova_compute[228134]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Dec  1 05:09:34 np0005540826 nova_compute[228134]: INFO:__main__:Writing out command to execute
Dec  1 05:09:34 np0005540826 nova_compute[228134]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Dec  1 05:09:34 np0005540826 nova_compute[228134]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Dec  1 05:09:34 np0005540826 nova_compute[228134]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Dec  1 05:09:34 np0005540826 nova_compute[228134]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec  1 05:09:34 np0005540826 nova_compute[228134]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec  1 05:09:34 np0005540826 nova_compute[228134]: ++ cat /run_command
Dec  1 05:09:34 np0005540826 nova_compute[228134]: + CMD=nova-compute
Dec  1 05:09:34 np0005540826 nova_compute[228134]: + ARGS=
Dec  1 05:09:34 np0005540826 nova_compute[228134]: + sudo kolla_copy_cacerts
Dec  1 05:09:34 np0005540826 nova_compute[228134]: + [[ ! -n '' ]]
Dec  1 05:09:34 np0005540826 nova_compute[228134]: + . kolla_extend_start
Dec  1 05:09:34 np0005540826 nova_compute[228134]: + echo 'Running command: '\''nova-compute'\'''
Dec  1 05:09:34 np0005540826 nova_compute[228134]: Running command: 'nova-compute'
Dec  1 05:09:34 np0005540826 nova_compute[228134]: + umask 0022
Dec  1 05:09:34 np0005540826 nova_compute[228134]: + exec nova-compute
Dec  1 05:09:34 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:09:34 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000022s ======
Dec  1 05:09:34 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:09:34.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec  1 05:09:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:34 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7944002520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:35 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:35 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f794c0043d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:35 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:35 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f791c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:35 np0005540826 python3.9[228296]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 05:09:35 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:09:35 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:09:35 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:09:35.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:09:36 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:09:36 np0005540826 nova_compute[228134]: 2025-12-01 10:09:36.612 228138 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Dec  1 05:09:36 np0005540826 nova_compute[228134]: 2025-12-01 10:09:36.613 228138 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Dec  1 05:09:36 np0005540826 nova_compute[228134]: 2025-12-01 10:09:36.613 228138 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Dec  1 05:09:36 np0005540826 nova_compute[228134]: 2025-12-01 10:09:36.614 228138 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m
Dec  1 05:09:36 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:09:36 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:09:36 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:09:36.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:09:36 np0005540826 nova_compute[228134]: 2025-12-01 10:09:36.768 228138 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:09:36 np0005540826 nova_compute[228134]: 2025-12-01 10:09:36.797 228138 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.029s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:09:36 np0005540826 nova_compute[228134]: 2025-12-01 10:09:36.797 228138 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Dec  1 05:09:36 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:36 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7920003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:36 np0005540826 python3.9[228449]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 05:09:37 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:37 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7944002520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.527 228138 INFO nova.virt.driver [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m
Dec  1 05:09:37 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:37 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f794c0043d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.652 228138 INFO nova.compute.provider_config [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.682 228138 DEBUG oslo_concurrency.lockutils [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.682 228138 DEBUG oslo_concurrency.lockutils [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.683 228138 DEBUG oslo_concurrency.lockutils [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.683 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.683 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.684 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.684 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.685 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.685 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.685 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.685 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.685 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.686 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.686 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.686 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.686 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.687 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.687 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.687 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.687 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.688 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.688 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.688 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] console_host                   = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.688 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.689 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.689 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.689 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.690 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.690 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.690 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.691 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.691 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.692 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.692 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.692 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.692 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.693 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.693 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.693 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.693 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.694 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.694 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.694 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.694 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.695 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.695 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.695 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.695 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.696 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.696 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.696 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.696 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.697 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.697 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.697 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.697 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.698 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.698 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.698 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.698 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.699 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.699 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.699 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.699 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.699 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.700 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.700 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.700 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.700 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.701 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.701 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.701 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.701 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.702 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.702 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.702 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.702 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.703 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.703 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.703 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.703 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.704 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.704 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.704 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] my_block_storage_ip            = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.704 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] my_ip                          = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.705 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.705 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.705 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.705 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.706 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.706 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.706 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.706 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.707 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.707 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.707 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.707 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.708 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.708 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.708 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.708 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.709 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.709 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.709 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.709 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.710 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.710 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.710 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.710 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.711 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.711 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.711 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.711 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.712 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.712 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.712 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.712 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.713 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.713 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.713 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.714 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.714 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.714 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.714 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.714 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.715 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.715 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.715 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.715 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.716 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.716 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.716 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.717 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.717 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.717 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.717 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.717 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.718 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.718 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.718 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.718 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.719 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.719 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.719 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.719 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.720 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.720 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.720 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.721 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.721 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.721 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.721 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.721 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.722 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.722 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.722 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.722 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.723 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.723 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.723 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.724 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.724 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.724 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.724 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.725 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.725 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.725 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.725 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.725 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.726 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.726 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.726 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.726 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.727 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.727 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.727 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.727 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.728 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.728 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.728 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.728 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.729 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.729 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.729 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.729 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.730 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.730 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.730 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.730 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.731 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.731 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.731 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.731 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.732 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.732 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.732 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.732 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.733 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.733 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.733 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.734 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.734 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.734 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.734 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.734 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.735 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.735 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.735 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.736 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.736 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.736 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.736 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.737 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.737 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.737 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.737 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.738 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.738 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.738 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.738 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.739 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.739 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.739 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.739 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.740 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.740 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.740 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.741 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.741 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.741 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.741 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.741 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.742 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.742 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.742 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.743 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.743 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.743 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.743 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.744 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.744 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.744 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.744 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.745 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.745 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.745 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.745 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.746 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.746 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.746 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.746 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.747 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.747 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.747 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.747 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.748 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.748 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.748 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.748 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.749 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.749 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.749 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.749 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.750 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.750 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.750 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.750 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.750 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.751 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.751 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.751 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.751 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.752 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.752 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.753 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.753 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.753 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.753 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.754 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.754 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.754 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.755 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.755 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.755 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.756 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.756 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.756 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.756 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.757 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.757 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.757 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.757 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.758 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.758 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.758 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.758 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.758 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.759 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.759 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.759 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.760 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.760 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.760 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.760 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.761 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.761 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.761 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.761 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.762 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.762 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.762 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.762 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.763 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.763 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.763 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.763 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.764 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.764 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.764 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.765 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.765 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.765 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.765 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.766 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.766 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.766 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.766 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.767 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.767 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.767 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.767 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.768 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.768 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.768 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.768 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.769 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.769 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.769 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.769 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.770 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.770 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.770 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.770 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.771 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.771 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.771 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.771 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.772 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.772 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.772 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.772 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.773 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.773 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.773 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.774 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.774 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.774 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.774 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.775 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.775 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.776 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.776 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.776 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.776 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.777 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.777 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.777 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.778 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.778 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.778 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.778 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.779 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.779 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.779 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.779 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.780 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.780 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.780 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.780 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.781 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.781 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.781 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.781 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.782 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.782 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.782 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.782 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.783 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.783 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.783 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.784 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.784 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.784 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.784 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.785 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.785 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.785 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.786 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.786 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.786 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.786 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.787 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.787 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.787 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.787 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.788 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.788 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.788 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.788 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.789 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.789 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.789 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.789 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.790 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.790 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.790 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.790 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.791 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.791 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.791 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.791 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.792 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.792 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.792 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.792 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.793 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.793 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.793 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.793 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.794 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.794 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.794 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.794 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.795 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.795 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.795 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.795 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.796 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.796 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.796 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.796 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.797 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.797 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.797 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.797 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.798 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.798 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.798 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.798 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.799 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.799 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.799 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.799 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.800 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.800 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.800 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.801 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.801 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.801 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.801 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.802 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.802 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.802 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.802 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.803 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.803 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.803 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.804 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.804 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.804 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.804 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.805 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.805 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.805 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.805 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.805 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.806 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.806 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.806 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.806 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.807 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.807 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.807 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.807 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.808 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.808 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.808 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.808 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.809 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.809 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.809 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.809 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.810 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.810 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.810 228138 WARNING oslo_config.cfg [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Dec  1 05:09:37 np0005540826 nova_compute[228134]: live_migration_uri is deprecated for removal in favor of two other options that
Dec  1 05:09:37 np0005540826 nova_compute[228134]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Dec  1 05:09:37 np0005540826 nova_compute[228134]: and ``live_migration_inbound_addr`` respectively.
Dec  1 05:09:37 np0005540826 nova_compute[228134]: ).  Its value may be silently ignored in the future.#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.810 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
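[Editor's note] The deprecation warning logged above says ``live_migration_uri`` is being replaced by ``live_migration_scheme`` and ``live_migration_inbound_addr``. A minimal nova.conf sketch of the equivalent replacement for the ``qemu+tls://%s/system`` value seen in this log; the inbound address is a placeholder, not taken from this host:

```ini
[libvirt]
# "tls" here supplies the qemu+tls transport that live_migration_uri
# previously encoded as qemu+tls://%s/system (default scheme is tcp).
live_migration_scheme = tls
# Address or hostname other computes use to reach this host as a
# migration target; replaces the %s placeholder in the old URI.
# PLACEHOLDER value - set to this host's migration-network address.
live_migration_inbound_addr = 192.0.2.10
# Already enabled on this host per the log line that follows the warning.
live_migration_with_native_tls = true
```

With both options set, Nova builds the same target URI the deprecated option produced, so the warning disappears without changing migration behavior.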
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.811 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.811 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.811 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.812 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.812 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.812 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.812 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.813 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.813 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.813 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.813 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.814 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.814 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.814 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.814 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.815 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.815 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.815 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] libvirt.rbd_secret_uuid        = 365f19c2-81e5-5edd-b6b4-280555214d3a log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.815 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.816 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.816 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.816 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.816 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.817 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.817 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.817 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.817 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.818 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.818 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.818 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.819 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.819 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.819 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.819 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.820 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.820 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.820 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.820 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.821 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.821 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.821 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.822 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.822 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.822 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.822 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.823 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.823 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.823 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.823 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.824 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.824 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.824 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.824 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.825 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.825 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.825 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.825 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.826 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.826 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.826 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.826 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.827 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.827 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.827 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.828 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.828 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.828 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.828 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.829 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.829 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.829 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.829 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.829 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.830 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.830 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.830 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.830 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.831 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.831 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.831 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.832 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.832 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.832 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.832 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.833 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.833 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.833 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.833 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.834 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.834 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.834 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.834 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.835 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.835 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.835 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.835 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.836 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.836 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.836 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.837 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.837 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.837 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.837 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.838 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.838 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.838 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.838 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.839 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.839 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.839 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.839 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.840 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.840 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.840 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.840 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.841 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.841 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.841 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.841 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.842 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.842 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.842 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.843 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.843 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.843 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.843 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.844 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.844 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.844 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.844 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.845 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.845 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.845 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.845 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.846 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.846 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.846 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.847 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.847 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.847 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.847 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.848 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.848 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.848 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.849 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.849 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.849 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.849 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.850 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.850 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.850 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.850 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.851 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.851 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.851 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.851 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.852 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.852 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.852 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.852 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.853 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.853 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.853 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.853 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.854 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.854 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.854 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.855 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.855 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.855 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.856 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.856 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.856 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.856 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.857 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.857 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.857 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.858 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.858 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.858 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.858 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.859 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.859 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.859 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.859 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.860 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.860 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.860 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.861 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.861 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.861 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.861 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.862 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.862 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.862 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.862 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.863 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.863 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.863 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.863 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.864 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.864 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.864 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.864 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.865 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.865 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.865 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.865 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.866 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.866 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.866 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.867 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.867 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.867 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.867 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.868 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.868 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.868 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.868 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.869 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.869 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.869 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.869 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.870 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.870 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.870 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.870 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.871 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.871 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.871 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.871 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.871 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.872 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.872 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.872 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.872 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.873 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.873 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.873 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.873 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.874 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.874 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.874 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.874 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.875 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.875 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.875 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.875 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.876 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.876 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.876 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.877 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] vnc.server_proxyclient_address = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.877 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.877 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.877 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.877 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.878 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.878 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.878 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.878 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.879 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.879 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.879 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.879 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.880 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 python3.9[228601]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.880 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.881 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.881 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.881 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.882 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.882 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.882 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.882 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.883 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.883 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.883 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.883 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.884 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.884 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.884 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.884 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.885 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.885 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.885 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.885 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.886 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.886 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.886 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.886 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.887 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.887 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.887 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.888 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.888 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.888 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.888 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.889 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.889 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.889 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.889 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.890 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.890 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.890 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.890 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.891 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.891 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.891 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.891 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.892 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.892 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.892 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.892 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.892 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.893 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.893 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.893 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.893 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.894 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.894 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.894 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.894 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.895 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.895 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.895 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.895 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.896 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.896 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.896 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.896 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.897 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.897 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.897 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.898 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.898 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.898 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.898 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.899 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.899 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.899 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.900 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.900 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.900 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.901 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.901 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.901 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.901 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.902 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.902 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.902 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.902 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.903 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.903 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.903 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.903 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.904 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.904 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.904 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.904 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.905 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.905 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.905 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.905 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.906 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.906 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.906 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.906 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.907 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.907 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.907 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.907 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.908 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.908 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.908 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.909 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.909 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.909 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.909 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.910 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.910 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.910 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.911 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.911 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.911 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.912 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.912 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.912 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.912 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.913 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.913 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.913 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.914 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.914 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.914 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.914 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.915 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.915 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.915 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.915 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.916 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.916 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.916 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.916 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.917 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.917 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.917 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.918 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.918 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.918 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.918 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.918 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.919 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.919 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.919 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.919 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.920 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.920 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.920 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.920 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.921 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.921 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.921 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.921 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.921 228138 DEBUG oslo_service.service [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Dec  1 05:09:37 np0005540826 nova_compute[228134]: 2025-12-01 10:09:37.923 228138 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)#033[00m
Dec  1 05:09:38 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:09:38 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:09:38 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:09:37.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:09:38 np0005540826 nova_compute[228134]: 2025-12-01 10:09:38.094 228138 DEBUG nova.virt.libvirt.host [None req-9e577fe9-d1dc-499e-982e-6876ee66b7a3 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m
Dec  1 05:09:38 np0005540826 nova_compute[228134]: 2025-12-01 10:09:38.095 228138 DEBUG nova.virt.libvirt.host [None req-9e577fe9-d1dc-499e-982e-6876ee66b7a3 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m
Dec  1 05:09:38 np0005540826 nova_compute[228134]: 2025-12-01 10:09:38.096 228138 DEBUG nova.virt.libvirt.host [None req-9e577fe9-d1dc-499e-982e-6876ee66b7a3 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m
Dec  1 05:09:38 np0005540826 nova_compute[228134]: 2025-12-01 10:09:38.096 228138 DEBUG nova.virt.libvirt.host [None req-9e577fe9-d1dc-499e-982e-6876ee66b7a3 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m
Dec  1 05:09:38 np0005540826 systemd[1]: Starting libvirt QEMU daemon...
Dec  1 05:09:38 np0005540826 systemd[1]: Started libvirt QEMU daemon.
Dec  1 05:09:38 np0005540826 nova_compute[228134]: 2025-12-01 10:09:38.176 228138 DEBUG nova.virt.libvirt.host [None req-9e577fe9-d1dc-499e-982e-6876ee66b7a3 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f51ca1e96d0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m
Dec  1 05:09:38 np0005540826 nova_compute[228134]: 2025-12-01 10:09:38.179 228138 DEBUG nova.virt.libvirt.host [None req-9e577fe9-d1dc-499e-982e-6876ee66b7a3 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f51ca1e96d0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m
Dec  1 05:09:38 np0005540826 nova_compute[228134]: 2025-12-01 10:09:38.180 228138 INFO nova.virt.libvirt.driver [None req-9e577fe9-d1dc-499e-982e-6876ee66b7a3 - - - - - -] Connection event '1' reason 'None'#033[00m
Dec  1 05:09:38 np0005540826 nova_compute[228134]: 2025-12-01 10:09:38.283 228138 WARNING nova.virt.libvirt.driver [None req-9e577fe9-d1dc-499e-982e-6876ee66b7a3 - - - - - -] Cannot update service status on host "compute-1.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-1.ctlplane.example.com could not be found.#033[00m
Dec  1 05:09:38 np0005540826 nova_compute[228134]: 2025-12-01 10:09:38.285 228138 DEBUG nova.virt.libvirt.volume.mount [None req-9e577fe9-d1dc-499e-982e-6876ee66b7a3 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130#033[00m
Dec  1 05:09:38 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:09:38 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000022s ======
Dec  1 05:09:38 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:09:38.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec  1 05:09:38 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:38 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f791c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:39 np0005540826 nova_compute[228134]: 2025-12-01 10:09:39.004 228138 INFO nova.virt.libvirt.host [None req-9e577fe9-d1dc-499e-982e-6876ee66b7a3 - - - - - -] Libvirt host capabilities <capabilities>
Dec  1 05:09:39 np0005540826 nova_compute[228134]: 
Dec  1 05:09:39 np0005540826 nova_compute[228134]:  <host>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <uuid>bbc1bf3e-9c61-4776-b766-63a97b665391</uuid>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <cpu>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <arch>x86_64</arch>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model>EPYC-Rome-v4</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <vendor>AMD</vendor>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <microcode version='16777317'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <signature family='23' model='49' stepping='0'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <maxphysaddr mode='emulate' bits='40'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature name='x2apic'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature name='tsc-deadline'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature name='osxsave'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature name='hypervisor'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature name='tsc_adjust'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature name='spec-ctrl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature name='stibp'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature name='arch-capabilities'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature name='ssbd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature name='cmp_legacy'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature name='topoext'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature name='virt-ssbd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature name='lbrv'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature name='tsc-scale'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature name='vmcb-clean'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature name='pause-filter'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature name='pfthreshold'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature name='svme-addr-chk'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature name='rdctl-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature name='skip-l1dfl-vmentry'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature name='mds-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature name='pschange-mc-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <pages unit='KiB' size='4'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <pages unit='KiB' size='2048'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <pages unit='KiB' size='1048576'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </cpu>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <power_management>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <suspend_mem/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </power_management>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <iommu support='no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <migration_features>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <live/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <uri_transports>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <uri_transport>tcp</uri_transport>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <uri_transport>rdma</uri_transport>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </uri_transports>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </migration_features>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <topology>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <cells num='1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <cell id='0'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:          <memory unit='KiB'>7864324</memory>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:          <pages unit='KiB' size='4'>1966081</pages>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:          <pages unit='KiB' size='2048'>0</pages>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:          <pages unit='KiB' size='1048576'>0</pages>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:          <distances>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:            <sibling id='0' value='10'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:          </distances>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:          <cpus num='8'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:            <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:            <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:            <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:            <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:            <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:            <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:            <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:            <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:          </cpus>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        </cell>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </cells>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </topology>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <cache>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </cache>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <secmodel>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model>selinux</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <doi>0</doi>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </secmodel>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <secmodel>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model>dac</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <doi>0</doi>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <baselabel type='kvm'>+107:+107</baselabel>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <baselabel type='qemu'>+107:+107</baselabel>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </secmodel>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:  </host>
Dec  1 05:09:39 np0005540826 nova_compute[228134]: 
Dec  1 05:09:39 np0005540826 nova_compute[228134]:  <guest>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <os_type>hvm</os_type>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <arch name='i686'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <wordsize>32</wordsize>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <domain type='qemu'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <domain type='kvm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </arch>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <features>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <pae/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <nonpae/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <acpi default='on' toggle='yes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <apic default='on' toggle='no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <cpuselection/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <deviceboot/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <disksnapshot default='on' toggle='no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <externalSnapshot/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </features>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:  </guest>
Dec  1 05:09:39 np0005540826 nova_compute[228134]: 
Dec  1 05:09:39 np0005540826 nova_compute[228134]:  <guest>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <os_type>hvm</os_type>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <arch name='x86_64'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <wordsize>64</wordsize>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <domain type='qemu'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <domain type='kvm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </arch>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <features>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <acpi default='on' toggle='yes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <apic default='on' toggle='no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <cpuselection/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <deviceboot/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <disksnapshot default='on' toggle='no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <externalSnapshot/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </features>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:  </guest>
Dec  1 05:09:39 np0005540826 nova_compute[228134]: 
Dec  1 05:09:39 np0005540826 nova_compute[228134]: </capabilities>
Dec  1 05:09:39 np0005540826 nova_compute[228134]: #033[00m
Dec  1 05:09:39 np0005540826 nova_compute[228134]: 2025-12-01 10:09:39.011 228138 DEBUG nova.virt.libvirt.host [None req-9e577fe9-d1dc-499e-982e-6876ee66b7a3 - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m
Dec  1 05:09:39 np0005540826 nova_compute[228134]: 2025-12-01 10:09:39.033 228138 DEBUG nova.virt.libvirt.host [None req-9e577fe9-d1dc-499e-982e-6876ee66b7a3 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Dec  1 05:09:39 np0005540826 nova_compute[228134]: <domainCapabilities>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:  <path>/usr/libexec/qemu-kvm</path>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:  <domain>kvm</domain>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:  <machine>pc-i440fx-rhel7.6.0</machine>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:  <arch>i686</arch>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:  <vcpu max='240'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:  <iothreads supported='yes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:  <os supported='yes'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <enum name='firmware'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <loader supported='yes'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='type'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>rom</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>pflash</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='readonly'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>yes</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>no</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='secure'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>no</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </loader>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:  </os>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:  <cpu>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <mode name='host-passthrough' supported='yes'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='hostPassthroughMigratable'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>on</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>off</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </mode>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <mode name='maximum' supported='yes'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='maximumMigratable'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>on</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>off</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </mode>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <mode name='host-model' supported='yes'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model fallback='forbid'>EPYC-Rome</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <vendor>AMD</vendor>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <maxphysaddr mode='passthrough' limit='40'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature policy='require' name='x2apic'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature policy='require' name='tsc-deadline'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature policy='require' name='hypervisor'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature policy='require' name='tsc_adjust'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature policy='require' name='spec-ctrl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature policy='require' name='stibp'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature policy='require' name='ssbd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature policy='require' name='cmp_legacy'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature policy='require' name='overflow-recov'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature policy='require' name='succor'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature policy='require' name='ibrs'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature policy='require' name='amd-ssbd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature policy='require' name='virt-ssbd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature policy='require' name='lbrv'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature policy='require' name='tsc-scale'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature policy='require' name='vmcb-clean'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature policy='require' name='flushbyasid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature policy='require' name='pause-filter'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature policy='require' name='pfthreshold'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature policy='require' name='svme-addr-chk'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature policy='require' name='lfence-always-serializing'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature policy='disable' name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </mode>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <mode name='custom' supported='yes'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Broadwell'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Broadwell-IBRS'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Broadwell-noTSX'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Broadwell-noTSX-IBRS'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Broadwell-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Broadwell-v2'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Broadwell-v3'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Broadwell-v4'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Cascadelake-Server'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Cascadelake-Server-noTSX'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Cascadelake-Server-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Cascadelake-Server-v2'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Cascadelake-Server-v3'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Cascadelake-Server-v4'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Cascadelake-Server-v5'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Cooperlake'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Cooperlake-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Cooperlake-v2'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Denverton'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='mpx'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Denverton-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='mpx'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Denverton-v2'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Denverton-v3'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Dhyana-v2'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='EPYC-Genoa'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amd-psfd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='auto-ibrs'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='no-nested-data-bp'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='null-sel-clr-base'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='stibp-always-on'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='EPYC-Genoa-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amd-psfd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='auto-ibrs'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='no-nested-data-bp'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='null-sel-clr-base'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='stibp-always-on'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='EPYC-Milan'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='EPYC-Milan-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='EPYC-Milan-v2'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amd-psfd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='no-nested-data-bp'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='null-sel-clr-base'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='stibp-always-on'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='EPYC-Rome'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='EPYC-Rome-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='EPYC-Rome-v2'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='EPYC-Rome-v3'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='EPYC-v3'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='EPYC-v4'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='GraniteRapids'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-bf16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-fp16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-int8'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-tile'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx-vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-fp16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fbsdp-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrc'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrs'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fzrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='mcdt-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pbrsb-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='prefetchiti'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='psdp-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='serialize'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xfd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='GraniteRapids-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-bf16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-fp16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-int8'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-tile'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx-vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-fp16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fbsdp-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrc'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrs'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fzrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='mcdt-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pbrsb-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='prefetchiti'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='psdp-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='serialize'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xfd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='GraniteRapids-v2'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-bf16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-fp16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-int8'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-tile'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx-vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx10'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx10-128'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx10-256'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx10-512'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-fp16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='cldemote'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fbsdp-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrc'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrs'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fzrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='mcdt-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='movdir64b'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='movdiri'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pbrsb-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='prefetchiti'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='psdp-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='serialize'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ss'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xfd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Haswell'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Haswell-IBRS'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Haswell-noTSX'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Haswell-noTSX-IBRS'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Haswell-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Haswell-v2'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Haswell-v3'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Haswell-v4'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Icelake-Server'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Icelake-Server-noTSX'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Icelake-Server-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Icelake-Server-v2'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Icelake-Server-v3'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Icelake-Server-v4'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Icelake-Server-v5'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Icelake-Server-v6'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Icelake-Server-v7'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='IvyBridge'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='IvyBridge-IBRS'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='IvyBridge-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='IvyBridge-v2'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='KnightsMill'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-4fmaps'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-4vnniw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512er'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512pf'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ss'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='KnightsMill-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-4fmaps'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-4vnniw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512er'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512pf'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ss'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Opteron_G4'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fma4'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xop'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Opteron_G4-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fma4'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xop'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Opteron_G5'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fma4'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='tbm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xop'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Opteron_G5-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fma4'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='tbm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xop'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='SapphireRapids'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-bf16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-int8'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-tile'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx-vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-fp16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrc'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrs'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fzrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='serialize'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xfd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='SapphireRapids-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-bf16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-int8'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-tile'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx-vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-fp16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrc'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrs'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fzrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='serialize'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xfd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='SapphireRapids-v2'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-bf16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-int8'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-tile'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx-vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-fp16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fbsdp-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrc'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrs'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fzrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='psdp-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='serialize'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xfd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='SapphireRapids-v3'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-bf16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-int8'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-tile'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx-vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-fp16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='cldemote'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fbsdp-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrc'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrs'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fzrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='movdir64b'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='movdiri'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='psdp-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='serialize'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ss'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xfd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='SierraForest'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx-ifma'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx-ne-convert'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx-vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx-vnni-int8'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='cmpccxadd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fbsdp-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrs'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='mcdt-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pbrsb-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='psdp-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='serialize'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='SierraForest-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx-ifma'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx-ne-convert'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx-vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx-vnni-int8'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='cmpccxadd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fbsdp-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrs'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='mcdt-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pbrsb-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='psdp-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='serialize'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Skylake-Client'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Skylake-Client-IBRS'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Skylake-Client-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Skylake-Client-v2'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Skylake-Client-v3'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Skylake-Client-v4'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Skylake-Server'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Skylake-Server-IBRS'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Skylake-Server-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Skylake-Server-v2'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Skylake-Server-v3'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Skylake-Server-v4'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Skylake-Server-v5'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Snowridge'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='cldemote'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='core-capability'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='movdir64b'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='movdiri'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='mpx'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='split-lock-detect'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Snowridge-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='cldemote'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='core-capability'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='movdir64b'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='movdiri'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='mpx'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='split-lock-detect'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Snowridge-v2'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='cldemote'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='core-capability'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='movdir64b'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='movdiri'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='split-lock-detect'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Snowridge-v3'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='cldemote'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='core-capability'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='movdir64b'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='movdiri'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='split-lock-detect'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Snowridge-v4'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='cldemote'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='movdir64b'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='movdiri'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='athlon'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='3dnow'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='3dnowext'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='athlon-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='3dnow'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='3dnowext'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='core2duo'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ss'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='core2duo-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ss'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='coreduo'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ss'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='coreduo-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ss'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='n270'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ss'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='n270-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ss'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='phenom'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='3dnow'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='3dnowext'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='phenom-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='3dnow'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='3dnowext'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </mode>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:  </cpu>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:  <memoryBacking supported='yes'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <enum name='sourceType'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <value>file</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <value>anonymous</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <value>memfd</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:  </memoryBacking>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:  <devices>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <disk supported='yes'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='diskDevice'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>disk</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>cdrom</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>floppy</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>lun</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='bus'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>ide</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>fdc</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>scsi</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>virtio</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>usb</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>sata</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='model'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>virtio</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>virtio-transitional</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>virtio-non-transitional</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </disk>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <graphics supported='yes'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='type'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>vnc</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>egl-headless</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>dbus</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </graphics>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <video supported='yes'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='modelType'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>vga</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>cirrus</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>virtio</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>none</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>bochs</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>ramfb</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </video>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <hostdev supported='yes'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='mode'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>subsystem</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='startupPolicy'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>default</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>mandatory</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>requisite</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>optional</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='subsysType'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>usb</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>pci</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>scsi</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='capsType'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='pciBackend'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </hostdev>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <rng supported='yes'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='model'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>virtio</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>virtio-transitional</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>virtio-non-transitional</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='backendModel'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>random</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>egd</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>builtin</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </rng>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <filesystem supported='yes'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='driverType'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>path</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>handle</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>virtiofs</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </filesystem>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <tpm supported='yes'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='model'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>tpm-tis</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>tpm-crb</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='backendModel'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>emulator</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>external</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='backendVersion'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>2.0</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </tpm>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <redirdev supported='yes'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='bus'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>usb</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </redirdev>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <channel supported='yes'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='type'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>pty</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>unix</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </channel>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <crypto supported='yes'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='model'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='type'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>qemu</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='backendModel'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>builtin</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </crypto>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <interface supported='yes'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='backendType'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>default</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>passt</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </interface>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <panic supported='yes'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='model'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>isa</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>hyperv</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </panic>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <console supported='yes'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='type'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>null</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>vc</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>pty</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>dev</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>file</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>pipe</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>stdio</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>udp</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>tcp</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>unix</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>qemu-vdagent</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>dbus</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </console>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:  </devices>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:  <features>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <gic supported='no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <vmcoreinfo supported='yes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <genid supported='yes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <backingStoreInput supported='yes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <backup supported='yes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <async-teardown supported='yes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <ps2 supported='yes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <sev supported='no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <sgx supported='no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <hyperv supported='yes'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='features'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>relaxed</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>vapic</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>spinlocks</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>vpindex</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>runtime</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>synic</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>stimer</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>reset</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>vendor_id</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>frequencies</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>reenlightenment</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>tlbflush</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>ipi</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>avic</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>emsr_bitmap</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>xmm_input</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <defaults>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <spinlocks>4095</spinlocks>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <stimer_direct>on</stimer_direct>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <tlbflush_direct>on</tlbflush_direct>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <tlbflush_extended>on</tlbflush_extended>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <vendor_id>Linux KVM Hv</vendor_id>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </defaults>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </hyperv>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <launchSecurity supported='yes'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='sectype'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>tdx</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </launchSecurity>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:  </features>
Dec  1 05:09:39 np0005540826 nova_compute[228134]: </domainCapabilities>
Dec  1 05:09:39 np0005540826 nova_compute[228134]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Dec  1 05:09:39 np0005540826 nova_compute[228134]: 2025-12-01 10:09:39.039 228138 DEBUG nova.virt.libvirt.host [None req-9e577fe9-d1dc-499e-982e-6876ee66b7a3 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Dec  1 05:09:39 np0005540826 nova_compute[228134]: <domainCapabilities>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:  <path>/usr/libexec/qemu-kvm</path>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:  <domain>kvm</domain>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:  <machine>pc-q35-rhel9.8.0</machine>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:  <arch>i686</arch>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:  <vcpu max='4096'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:  <iothreads supported='yes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:  <os supported='yes'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <enum name='firmware'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <loader supported='yes'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='type'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>rom</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>pflash</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='readonly'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>yes</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>no</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='secure'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>no</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </loader>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:  </os>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:  <cpu>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <mode name='host-passthrough' supported='yes'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='hostPassthroughMigratable'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>on</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>off</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </mode>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <mode name='maximum' supported='yes'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='maximumMigratable'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>on</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>off</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </mode>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <mode name='host-model' supported='yes'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model fallback='forbid'>EPYC-Rome</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <vendor>AMD</vendor>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <maxphysaddr mode='passthrough' limit='40'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature policy='require' name='x2apic'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature policy='require' name='tsc-deadline'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature policy='require' name='hypervisor'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature policy='require' name='tsc_adjust'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature policy='require' name='spec-ctrl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature policy='require' name='stibp'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature policy='require' name='ssbd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature policy='require' name='cmp_legacy'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature policy='require' name='overflow-recov'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature policy='require' name='succor'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature policy='require' name='ibrs'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature policy='require' name='amd-ssbd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature policy='require' name='virt-ssbd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature policy='require' name='lbrv'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature policy='require' name='tsc-scale'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature policy='require' name='vmcb-clean'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature policy='require' name='flushbyasid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature policy='require' name='pause-filter'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature policy='require' name='pfthreshold'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature policy='require' name='svme-addr-chk'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature policy='require' name='lfence-always-serializing'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature policy='disable' name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </mode>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <mode name='custom' supported='yes'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Broadwell'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Broadwell-IBRS'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Broadwell-noTSX'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Broadwell-noTSX-IBRS'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Broadwell-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Broadwell-v2'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Broadwell-v3'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Broadwell-v4'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Cascadelake-Server'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Cascadelake-Server-noTSX'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Cascadelake-Server-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Cascadelake-Server-v2'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Cascadelake-Server-v3'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Cascadelake-Server-v4'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Cascadelake-Server-v5'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Cooperlake'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Cooperlake-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Cooperlake-v2'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 python3.9[228816]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Denverton'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='mpx'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Denverton-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='mpx'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Denverton-v2'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Denverton-v3'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Dhyana-v2'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='EPYC-Genoa'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amd-psfd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='auto-ibrs'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='no-nested-data-bp'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='null-sel-clr-base'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='stibp-always-on'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='EPYC-Genoa-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amd-psfd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='auto-ibrs'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='no-nested-data-bp'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='null-sel-clr-base'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='stibp-always-on'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='EPYC-Milan'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='EPYC-Milan-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='EPYC-Milan-v2'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amd-psfd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='no-nested-data-bp'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='null-sel-clr-base'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='stibp-always-on'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='EPYC-Rome'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='EPYC-Rome-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='EPYC-Rome-v2'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='EPYC-Rome-v3'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='EPYC-v3'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='EPYC-v4'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='GraniteRapids'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-bf16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-fp16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-int8'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-tile'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx-vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-fp16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fbsdp-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrc'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrs'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fzrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='mcdt-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pbrsb-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='prefetchiti'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='psdp-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='serialize'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xfd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='GraniteRapids-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-bf16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-fp16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-int8'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-tile'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx-vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-fp16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fbsdp-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrc'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrs'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fzrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='mcdt-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pbrsb-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='prefetchiti'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='psdp-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='serialize'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xfd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='GraniteRapids-v2'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-bf16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-fp16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-int8'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-tile'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx-vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx10'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx10-128'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx10-256'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx10-512'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-fp16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='cldemote'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fbsdp-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrc'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrs'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fzrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='mcdt-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='movdir64b'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='movdiri'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pbrsb-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='prefetchiti'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='psdp-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='serialize'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ss'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xfd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Haswell'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Haswell-IBRS'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Haswell-noTSX'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Haswell-noTSX-IBRS'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Haswell-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Haswell-v2'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Haswell-v3'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Haswell-v4'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Icelake-Server'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Icelake-Server-noTSX'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Icelake-Server-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Icelake-Server-v2'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Icelake-Server-v3'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Icelake-Server-v4'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Icelake-Server-v5'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Icelake-Server-v6'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Icelake-Server-v7'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='IvyBridge'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='IvyBridge-IBRS'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='IvyBridge-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='IvyBridge-v2'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='KnightsMill'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-4fmaps'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-4vnniw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512er'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512pf'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ss'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='KnightsMill-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-4fmaps'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-4vnniw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512er'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512pf'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ss'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Opteron_G4'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fma4'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xop'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Opteron_G4-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fma4'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xop'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Opteron_G5'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fma4'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='tbm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xop'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Opteron_G5-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fma4'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='tbm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xop'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='SapphireRapids'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-bf16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-int8'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-tile'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx-vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-fp16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrc'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrs'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fzrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='serialize'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xfd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='SapphireRapids-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-bf16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-int8'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-tile'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx-vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-fp16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrc'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrs'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fzrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='serialize'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xfd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='SapphireRapids-v2'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-bf16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-int8'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-tile'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx-vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-fp16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fbsdp-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrc'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrs'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fzrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='psdp-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='serialize'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xfd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='SapphireRapids-v3'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-bf16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-int8'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-tile'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx-vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-fp16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='cldemote'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fbsdp-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrc'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrs'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fzrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='movdir64b'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='movdiri'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='psdp-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='serialize'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ss'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xfd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='SierraForest'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx-ifma'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx-ne-convert'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx-vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx-vnni-int8'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='cmpccxadd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fbsdp-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrs'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='mcdt-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pbrsb-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='psdp-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='serialize'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='SierraForest-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx-ifma'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx-ne-convert'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx-vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx-vnni-int8'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='cmpccxadd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fbsdp-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrs'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='mcdt-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pbrsb-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='psdp-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='serialize'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Skylake-Client'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Skylake-Client-IBRS'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Skylake-Client-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Skylake-Client-v2'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Skylake-Client-v3'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Skylake-Client-v4'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Skylake-Server'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Skylake-Server-IBRS'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Skylake-Server-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Skylake-Server-v2'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Skylake-Server-v3'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Skylake-Server-v4'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Skylake-Server-v5'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Snowridge'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='cldemote'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='core-capability'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='movdir64b'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='movdiri'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='mpx'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='split-lock-detect'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Snowridge-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='cldemote'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='core-capability'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='movdir64b'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='movdiri'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='mpx'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='split-lock-detect'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Snowridge-v2'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='cldemote'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='core-capability'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='movdir64b'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='movdiri'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='split-lock-detect'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Snowridge-v3'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='cldemote'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='core-capability'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='movdir64b'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='movdiri'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='split-lock-detect'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Snowridge-v4'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='cldemote'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='movdir64b'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='movdiri'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='athlon'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='3dnow'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='3dnowext'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='athlon-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='3dnow'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='3dnowext'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='core2duo'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ss'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='core2duo-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ss'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='coreduo'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ss'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='coreduo-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ss'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='n270'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ss'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='n270-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ss'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='phenom'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='3dnow'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='3dnowext'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='phenom-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='3dnow'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='3dnowext'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </mode>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:  </cpu>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:  <memoryBacking supported='yes'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <enum name='sourceType'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <value>file</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <value>anonymous</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <value>memfd</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:  </memoryBacking>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:  <devices>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <disk supported='yes'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='diskDevice'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>disk</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>cdrom</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>floppy</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>lun</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='bus'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>fdc</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>scsi</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>virtio</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>usb</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>sata</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='model'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>virtio</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>virtio-transitional</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>virtio-non-transitional</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </disk>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <graphics supported='yes'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='type'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>vnc</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>egl-headless</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>dbus</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </graphics>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <video supported='yes'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='modelType'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>vga</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>cirrus</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>virtio</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>none</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>bochs</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>ramfb</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </video>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <hostdev supported='yes'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='mode'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>subsystem</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='startupPolicy'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>default</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>mandatory</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>requisite</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>optional</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='subsysType'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>usb</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>pci</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>scsi</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='capsType'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='pciBackend'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </hostdev>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <rng supported='yes'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='model'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>virtio</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>virtio-transitional</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>virtio-non-transitional</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='backendModel'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>random</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>egd</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>builtin</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </rng>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <filesystem supported='yes'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='driverType'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>path</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>handle</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>virtiofs</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </filesystem>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <tpm supported='yes'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='model'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>tpm-tis</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>tpm-crb</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='backendModel'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>emulator</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>external</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='backendVersion'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>2.0</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </tpm>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <redirdev supported='yes'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='bus'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>usb</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </redirdev>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <channel supported='yes'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='type'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>pty</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>unix</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </channel>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <crypto supported='yes'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='model'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='type'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>qemu</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='backendModel'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>builtin</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </crypto>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <interface supported='yes'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='backendType'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>default</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>passt</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </interface>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <panic supported='yes'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='model'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>isa</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>hyperv</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </panic>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <console supported='yes'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='type'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>null</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>vc</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>pty</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>dev</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>file</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>pipe</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>stdio</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>udp</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>tcp</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>unix</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>qemu-vdagent</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>dbus</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </console>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:  </devices>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:  <features>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <gic supported='no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <vmcoreinfo supported='yes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <genid supported='yes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <backingStoreInput supported='yes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <backup supported='yes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <async-teardown supported='yes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <ps2 supported='yes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <sev supported='no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <sgx supported='no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <hyperv supported='yes'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='features'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>relaxed</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>vapic</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>spinlocks</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>vpindex</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>runtime</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>synic</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>stimer</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>reset</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>vendor_id</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>frequencies</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>reenlightenment</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>tlbflush</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>ipi</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>avic</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>emsr_bitmap</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>xmm_input</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <defaults>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <spinlocks>4095</spinlocks>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <stimer_direct>on</stimer_direct>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <tlbflush_direct>on</tlbflush_direct>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <tlbflush_extended>on</tlbflush_extended>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <vendor_id>Linux KVM Hv</vendor_id>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </defaults>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </hyperv>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <launchSecurity supported='yes'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='sectype'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>tdx</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </launchSecurity>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:  </features>
Dec  1 05:09:39 np0005540826 nova_compute[228134]: </domainCapabilities>
Dec  1 05:09:39 np0005540826 nova_compute[228134]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Dec  1 05:09:39 np0005540826 nova_compute[228134]: 2025-12-01 10:09:39.065 228138 DEBUG nova.virt.libvirt.host [None req-9e577fe9-d1dc-499e-982e-6876ee66b7a3 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m
Dec  1 05:09:39 np0005540826 nova_compute[228134]: 2025-12-01 10:09:39.069 228138 DEBUG nova.virt.libvirt.host [None req-9e577fe9-d1dc-499e-982e-6876ee66b7a3 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Dec  1 05:09:39 np0005540826 nova_compute[228134]: <domainCapabilities>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:  <path>/usr/libexec/qemu-kvm</path>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:  <domain>kvm</domain>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:  <machine>pc-i440fx-rhel7.6.0</machine>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:  <arch>x86_64</arch>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:  <vcpu max='240'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:  <iothreads supported='yes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:  <os supported='yes'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <enum name='firmware'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <loader supported='yes'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='type'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>rom</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>pflash</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='readonly'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>yes</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>no</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='secure'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>no</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </loader>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:  </os>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:  <cpu>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <mode name='host-passthrough' supported='yes'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='hostPassthroughMigratable'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>on</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>off</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </mode>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <mode name='maximum' supported='yes'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='maximumMigratable'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>on</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>off</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </mode>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <mode name='host-model' supported='yes'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model fallback='forbid'>EPYC-Rome</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <vendor>AMD</vendor>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <maxphysaddr mode='passthrough' limit='40'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature policy='require' name='x2apic'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature policy='require' name='tsc-deadline'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature policy='require' name='hypervisor'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature policy='require' name='tsc_adjust'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature policy='require' name='spec-ctrl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature policy='require' name='stibp'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature policy='require' name='ssbd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature policy='require' name='cmp_legacy'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature policy='require' name='overflow-recov'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature policy='require' name='succor'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature policy='require' name='ibrs'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature policy='require' name='amd-ssbd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature policy='require' name='virt-ssbd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature policy='require' name='lbrv'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature policy='require' name='tsc-scale'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature policy='require' name='vmcb-clean'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature policy='require' name='flushbyasid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature policy='require' name='pause-filter'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature policy='require' name='pfthreshold'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature policy='require' name='svme-addr-chk'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature policy='require' name='lfence-always-serializing'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature policy='disable' name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </mode>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <mode name='custom' supported='yes'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Broadwell'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Broadwell-IBRS'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Broadwell-noTSX'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Broadwell-noTSX-IBRS'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Broadwell-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Broadwell-v2'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Broadwell-v3'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Broadwell-v4'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Cascadelake-Server'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Cascadelake-Server-noTSX'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Cascadelake-Server-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Cascadelake-Server-v2'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Cascadelake-Server-v3'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Cascadelake-Server-v4'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Cascadelake-Server-v5'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Cooperlake'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Cooperlake-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Cooperlake-v2'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Denverton'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='mpx'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Denverton-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='mpx'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Denverton-v2'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Denverton-v3'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Dhyana-v2'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='EPYC-Genoa'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amd-psfd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='auto-ibrs'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='no-nested-data-bp'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='null-sel-clr-base'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='stibp-always-on'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='EPYC-Genoa-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amd-psfd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='auto-ibrs'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='no-nested-data-bp'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='null-sel-clr-base'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='stibp-always-on'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='EPYC-Milan'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='EPYC-Milan-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='EPYC-Milan-v2'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amd-psfd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='no-nested-data-bp'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='null-sel-clr-base'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='stibp-always-on'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='EPYC-Rome'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='EPYC-Rome-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='EPYC-Rome-v2'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='EPYC-Rome-v3'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='EPYC-v3'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='EPYC-v4'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='GraniteRapids'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-bf16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-fp16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-int8'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-tile'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx-vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-fp16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fbsdp-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrc'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrs'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fzrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='mcdt-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pbrsb-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='prefetchiti'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='psdp-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='serialize'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xfd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='GraniteRapids-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-bf16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-fp16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-int8'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-tile'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx-vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-fp16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fbsdp-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrc'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrs'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fzrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='mcdt-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pbrsb-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='prefetchiti'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='psdp-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='serialize'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xfd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='GraniteRapids-v2'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-bf16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-fp16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-int8'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-tile'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx-vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx10'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx10-128'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx10-256'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx10-512'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-fp16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='cldemote'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fbsdp-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrc'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrs'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fzrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='mcdt-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='movdir64b'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='movdiri'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pbrsb-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='prefetchiti'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='psdp-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='serialize'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ss'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xfd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Haswell'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Haswell-IBRS'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Haswell-noTSX'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Haswell-noTSX-IBRS'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Haswell-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Haswell-v2'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Haswell-v3'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Haswell-v4'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Icelake-Server'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Icelake-Server-noTSX'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Icelake-Server-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Icelake-Server-v2'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Icelake-Server-v3'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Icelake-Server-v4'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Icelake-Server-v5'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Icelake-Server-v6'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Icelake-Server-v7'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='IvyBridge'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='IvyBridge-IBRS'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='IvyBridge-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='IvyBridge-v2'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='KnightsMill'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-4fmaps'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-4vnniw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512er'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512pf'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ss'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='KnightsMill-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-4fmaps'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-4vnniw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512er'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512pf'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ss'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Opteron_G4'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fma4'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xop'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Opteron_G4-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fma4'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xop'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Opteron_G5'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fma4'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='tbm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xop'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Opteron_G5-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fma4'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='tbm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xop'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='SapphireRapids'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-bf16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-int8'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-tile'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx-vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-fp16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrc'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrs'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fzrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='serialize'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xfd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='SapphireRapids-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-bf16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-int8'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-tile'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx-vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-fp16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrc'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrs'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fzrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='serialize'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xfd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='SapphireRapids-v2'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-bf16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-int8'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-tile'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx-vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-fp16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fbsdp-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrc'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrs'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fzrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='psdp-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='serialize'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xfd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='SapphireRapids-v3'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-bf16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-int8'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-tile'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx-vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-fp16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='cldemote'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fbsdp-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrc'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrs'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fzrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='movdir64b'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='movdiri'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='psdp-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='serialize'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ss'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xfd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='SierraForest'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx-ifma'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx-ne-convert'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx-vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx-vnni-int8'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='cmpccxadd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fbsdp-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrs'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='mcdt-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pbrsb-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='psdp-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='serialize'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='SierraForest-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx-ifma'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx-ne-convert'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx-vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx-vnni-int8'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='cmpccxadd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fbsdp-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrs'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='mcdt-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pbrsb-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='psdp-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='serialize'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Skylake-Client'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Skylake-Client-IBRS'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Skylake-Client-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Skylake-Client-v2'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Skylake-Client-v3'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Skylake-Client-v4'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Skylake-Server'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Skylake-Server-IBRS'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Skylake-Server-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Skylake-Server-v2'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Skylake-Server-v3'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Skylake-Server-v4'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Skylake-Server-v5'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Snowridge'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='cldemote'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='core-capability'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='movdir64b'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='movdiri'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='mpx'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='split-lock-detect'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Snowridge-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='cldemote'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='core-capability'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='movdir64b'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='movdiri'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='mpx'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='split-lock-detect'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Snowridge-v2'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='cldemote'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='core-capability'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='movdir64b'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='movdiri'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='split-lock-detect'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Snowridge-v3'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='cldemote'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='core-capability'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='movdir64b'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='movdiri'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='split-lock-detect'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Snowridge-v4'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='cldemote'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='movdir64b'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='movdiri'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='athlon'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='3dnow'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='3dnowext'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='athlon-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='3dnow'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='3dnowext'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='core2duo'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ss'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='core2duo-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ss'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='coreduo'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ss'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='coreduo-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ss'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='n270'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ss'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='n270-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ss'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='phenom'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='3dnow'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='3dnowext'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='phenom-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='3dnow'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='3dnowext'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </mode>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:  </cpu>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:  <memoryBacking supported='yes'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <enum name='sourceType'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <value>file</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <value>anonymous</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <value>memfd</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:  </memoryBacking>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:  <devices>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <disk supported='yes'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='diskDevice'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>disk</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>cdrom</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>floppy</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>lun</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='bus'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>ide</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>fdc</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>scsi</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>virtio</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>usb</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>sata</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='model'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>virtio</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>virtio-transitional</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>virtio-non-transitional</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </disk>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <graphics supported='yes'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='type'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>vnc</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>egl-headless</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>dbus</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </graphics>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <video supported='yes'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='modelType'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>vga</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>cirrus</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>virtio</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>none</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>bochs</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>ramfb</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </video>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <hostdev supported='yes'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='mode'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>subsystem</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='startupPolicy'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>default</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>mandatory</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>requisite</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>optional</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='subsysType'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>usb</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>pci</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>scsi</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='capsType'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='pciBackend'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </hostdev>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <rng supported='yes'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='model'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>virtio</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>virtio-transitional</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>virtio-non-transitional</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='backendModel'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>random</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>egd</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>builtin</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </rng>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <filesystem supported='yes'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='driverType'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>path</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>handle</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>virtiofs</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </filesystem>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <tpm supported='yes'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='model'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>tpm-tis</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>tpm-crb</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='backendModel'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>emulator</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>external</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='backendVersion'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>2.0</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </tpm>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <redirdev supported='yes'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='bus'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>usb</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </redirdev>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <channel supported='yes'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='type'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>pty</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>unix</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </channel>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <crypto supported='yes'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='model'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='type'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>qemu</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='backendModel'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>builtin</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </crypto>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <interface supported='yes'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='backendType'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>default</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>passt</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </interface>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <panic supported='yes'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='model'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>isa</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>hyperv</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </panic>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <console supported='yes'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='type'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>null</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>vc</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>pty</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>dev</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>file</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>pipe</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>stdio</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>udp</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>tcp</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>unix</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>qemu-vdagent</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>dbus</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </console>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:  </devices>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:  <features>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <gic supported='no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <vmcoreinfo supported='yes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <genid supported='yes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <backingStoreInput supported='yes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <backup supported='yes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <async-teardown supported='yes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <ps2 supported='yes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <sev supported='no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <sgx supported='no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <hyperv supported='yes'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='features'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>relaxed</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>vapic</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>spinlocks</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>vpindex</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>runtime</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>synic</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>stimer</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>reset</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>vendor_id</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>frequencies</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>reenlightenment</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>tlbflush</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>ipi</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>avic</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>emsr_bitmap</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>xmm_input</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <defaults>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <spinlocks>4095</spinlocks>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <stimer_direct>on</stimer_direct>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <tlbflush_direct>on</tlbflush_direct>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <tlbflush_extended>on</tlbflush_extended>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <vendor_id>Linux KVM Hv</vendor_id>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </defaults>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </hyperv>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <launchSecurity supported='yes'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='sectype'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>tdx</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </launchSecurity>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:  </features>
Dec  1 05:09:39 np0005540826 nova_compute[228134]: </domainCapabilities>
Dec  1 05:09:39 np0005540826 nova_compute[228134]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec  1 05:09:39 np0005540826 nova_compute[228134]: 2025-12-01 10:09:39.137 228138 DEBUG nova.virt.libvirt.host [None req-9e577fe9-d1dc-499e-982e-6876ee66b7a3 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Dec  1 05:09:39 np0005540826 nova_compute[228134]: <domainCapabilities>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:  <path>/usr/libexec/qemu-kvm</path>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:  <domain>kvm</domain>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:  <machine>pc-q35-rhel9.8.0</machine>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:  <arch>x86_64</arch>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:  <vcpu max='4096'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:  <iothreads supported='yes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:  <os supported='yes'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <enum name='firmware'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <value>efi</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <loader supported='yes'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='type'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>rom</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>pflash</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='readonly'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>yes</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>no</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='secure'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>yes</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>no</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </loader>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:  </os>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:  <cpu>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <mode name='host-passthrough' supported='yes'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='hostPassthroughMigratable'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>on</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>off</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </mode>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <mode name='maximum' supported='yes'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='maximumMigratable'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>on</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>off</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </mode>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <mode name='host-model' supported='yes'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model fallback='forbid'>EPYC-Rome</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <vendor>AMD</vendor>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <maxphysaddr mode='passthrough' limit='40'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature policy='require' name='x2apic'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature policy='require' name='tsc-deadline'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature policy='require' name='hypervisor'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature policy='require' name='tsc_adjust'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature policy='require' name='spec-ctrl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature policy='require' name='stibp'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature policy='require' name='ssbd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature policy='require' name='cmp_legacy'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature policy='require' name='overflow-recov'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature policy='require' name='succor'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature policy='require' name='ibrs'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature policy='require' name='amd-ssbd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature policy='require' name='virt-ssbd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature policy='require' name='lbrv'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature policy='require' name='tsc-scale'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature policy='require' name='vmcb-clean'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature policy='require' name='flushbyasid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature policy='require' name='pause-filter'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature policy='require' name='pfthreshold'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature policy='require' name='svme-addr-chk'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature policy='require' name='lfence-always-serializing'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <feature policy='disable' name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </mode>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <mode name='custom' supported='yes'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Broadwell'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Broadwell-IBRS'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Broadwell-noTSX'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Broadwell-noTSX-IBRS'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Broadwell-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Broadwell-v2'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Broadwell-v3'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Broadwell-v4'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Cascadelake-Server'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Cascadelake-Server-noTSX'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Cascadelake-Server-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Cascadelake-Server-v2'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Cascadelake-Server-v3'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Cascadelake-Server-v4'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Cascadelake-Server-v5'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Cooperlake'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Cooperlake-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Cooperlake-v2'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Denverton'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='mpx'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Denverton-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='mpx'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Denverton-v2'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Denverton-v3'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Dhyana-v2'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='EPYC-Genoa'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amd-psfd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='auto-ibrs'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='no-nested-data-bp'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='null-sel-clr-base'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='stibp-always-on'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='EPYC-Genoa-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amd-psfd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='auto-ibrs'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='no-nested-data-bp'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='null-sel-clr-base'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='stibp-always-on'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='EPYC-Milan'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='EPYC-Milan-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='EPYC-Milan-v2'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amd-psfd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='no-nested-data-bp'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='null-sel-clr-base'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='stibp-always-on'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='EPYC-Rome'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='EPYC-Rome-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='EPYC-Rome-v2'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='EPYC-Rome-v3'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='EPYC-v3'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='EPYC-v4'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='GraniteRapids'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-bf16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-fp16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-int8'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-tile'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx-vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-fp16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fbsdp-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrc'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrs'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fzrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='mcdt-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pbrsb-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='prefetchiti'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='psdp-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='serialize'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xfd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='GraniteRapids-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-bf16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-fp16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-int8'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-tile'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx-vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-fp16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fbsdp-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrc'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrs'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fzrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='mcdt-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pbrsb-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='prefetchiti'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='psdp-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='serialize'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xfd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='GraniteRapids-v2'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-bf16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-fp16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-int8'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-tile'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx-vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx10'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx10-128'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx10-256'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx10-512'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-fp16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='cldemote'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fbsdp-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrc'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrs'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fzrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='mcdt-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='movdir64b'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='movdiri'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pbrsb-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='prefetchiti'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='psdp-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='serialize'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ss'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xfd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Haswell'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Haswell-IBRS'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Haswell-noTSX'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Haswell-noTSX-IBRS'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Haswell-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Haswell-v2'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Haswell-v3'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Haswell-v4'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Icelake-Server'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Icelake-Server-noTSX'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Icelake-Server-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Icelake-Server-v2'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Icelake-Server-v3'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Icelake-Server-v4'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Icelake-Server-v5'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Icelake-Server-v6'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Icelake-Server-v7'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='IvyBridge'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='IvyBridge-IBRS'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='IvyBridge-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='IvyBridge-v2'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='KnightsMill'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-4fmaps'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-4vnniw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512er'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512pf'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ss'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='KnightsMill-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-4fmaps'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-4vnniw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512er'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512pf'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ss'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Opteron_G4'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fma4'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xop'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Opteron_G4-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fma4'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xop'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Opteron_G5'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fma4'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='tbm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xop'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Opteron_G5-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fma4'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='tbm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xop'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='SapphireRapids'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-bf16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-int8'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-tile'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx-vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-fp16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrc'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrs'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fzrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='serialize'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xfd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='SapphireRapids-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-bf16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-int8'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-tile'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx-vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-fp16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrc'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrs'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fzrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='serialize'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xfd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='SapphireRapids-v2'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-bf16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-int8'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-tile'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx-vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-fp16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fbsdp-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrc'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrs'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fzrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='psdp-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='serialize'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xfd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='SapphireRapids-v3'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-bf16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-int8'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='amx-tile'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx-vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-fp16'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='cldemote'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fbsdp-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrc'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrs'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fzrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='movdir64b'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='movdiri'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='psdp-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='serialize'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ss'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xfd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='SierraForest'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx-ifma'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx-ne-convert'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx-vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx-vnni-int8'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='cmpccxadd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fbsdp-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrs'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='mcdt-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pbrsb-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='psdp-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='serialize'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='SierraForest-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx-ifma'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx-ne-convert'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx-vnni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx-vnni-int8'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='cmpccxadd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fbsdp-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='fsrs'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='mcdt-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pbrsb-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='psdp-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='serialize'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Skylake-Client'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Skylake-Client-IBRS'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Skylake-Client-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Skylake-Client-v2'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Skylake-Client-v3'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Skylake-Client-v4'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Skylake-Server'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Skylake-Server-IBRS'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Skylake-Server-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Skylake-Server-v2'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Skylake-Server-v3'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Skylake-Server-v4'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Skylake-Server-v5'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Snowridge'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='cldemote'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='core-capability'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='movdir64b'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='movdiri'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='mpx'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='split-lock-detect'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Snowridge-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='cldemote'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='core-capability'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='movdir64b'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='movdiri'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='mpx'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='split-lock-detect'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Snowridge-v2'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='cldemote'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='core-capability'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='movdir64b'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='movdiri'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='split-lock-detect'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Snowridge-v3'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='cldemote'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='core-capability'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='movdir64b'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='movdiri'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='split-lock-detect'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='Snowridge-v4'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='cldemote'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='movdir64b'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='movdiri'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='athlon'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='3dnow'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='3dnowext'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='athlon-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='3dnow'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='3dnowext'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='core2duo'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ss'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='core2duo-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ss'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='coreduo'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ss'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='coreduo-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ss'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='n270'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ss'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='n270-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='ss'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='phenom'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='3dnow'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='3dnowext'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <blockers model='phenom-v1'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='3dnow'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <feature name='3dnowext'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </blockers>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec  1 05:09:39 np0005540826 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </mode>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:  </cpu>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:  <memoryBacking supported='yes'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <enum name='sourceType'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <value>file</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <value>anonymous</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <value>memfd</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:  </memoryBacking>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:  <devices>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <disk supported='yes'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='diskDevice'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>disk</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>cdrom</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>floppy</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>lun</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='bus'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>fdc</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>scsi</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>virtio</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>usb</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>sata</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='model'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>virtio</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>virtio-transitional</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>virtio-non-transitional</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </disk>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <graphics supported='yes'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='type'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>vnc</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>egl-headless</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>dbus</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </graphics>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <video supported='yes'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='modelType'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>vga</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>cirrus</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>virtio</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>none</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>bochs</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>ramfb</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </video>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <hostdev supported='yes'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='mode'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>subsystem</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='startupPolicy'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>default</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>mandatory</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>requisite</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>optional</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='subsysType'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>usb</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>pci</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>scsi</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='capsType'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='pciBackend'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </hostdev>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <rng supported='yes'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='model'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>virtio</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>virtio-transitional</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>virtio-non-transitional</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='backendModel'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>random</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>egd</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>builtin</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </rng>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <filesystem supported='yes'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='driverType'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>path</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>handle</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>virtiofs</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </filesystem>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <tpm supported='yes'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='model'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>tpm-tis</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>tpm-crb</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='backendModel'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>emulator</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>external</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='backendVersion'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>2.0</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </tpm>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <redirdev supported='yes'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='bus'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>usb</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </redirdev>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <channel supported='yes'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='type'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>pty</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>unix</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </channel>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <crypto supported='yes'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='model'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='type'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>qemu</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='backendModel'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>builtin</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </crypto>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <interface supported='yes'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='backendType'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>default</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>passt</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </interface>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <panic supported='yes'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='model'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>isa</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>hyperv</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </panic>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <console supported='yes'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='type'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>null</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>vc</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>pty</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>dev</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>file</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>pipe</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>stdio</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>udp</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>tcp</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>unix</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>qemu-vdagent</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>dbus</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </console>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:  </devices>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:  <features>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <gic supported='no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <vmcoreinfo supported='yes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <genid supported='yes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <backingStoreInput supported='yes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <backup supported='yes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <async-teardown supported='yes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <ps2 supported='yes'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <sev supported='no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <sgx supported='no'/>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <hyperv supported='yes'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='features'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>relaxed</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>vapic</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>spinlocks</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>vpindex</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>runtime</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>synic</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>stimer</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>reset</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>vendor_id</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>frequencies</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>reenlightenment</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>tlbflush</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>ipi</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>avic</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>emsr_bitmap</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>xmm_input</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <defaults>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <spinlocks>4095</spinlocks>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <stimer_direct>on</stimer_direct>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <tlbflush_direct>on</tlbflush_direct>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <tlbflush_extended>on</tlbflush_extended>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <vendor_id>Linux KVM Hv</vendor_id>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </defaults>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </hyperv>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    <launchSecurity supported='yes'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      <enum name='sectype'>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:        <value>tdx</value>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:      </enum>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:    </launchSecurity>
Dec  1 05:09:39 np0005540826 nova_compute[228134]:  </features>
Dec  1 05:09:39 np0005540826 nova_compute[228134]: </domainCapabilities>
Dec  1 05:09:39 np0005540826 nova_compute[228134]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Dec  1 05:09:39 np0005540826 nova_compute[228134]: 2025-12-01 10:09:39.206 228138 DEBUG nova.virt.libvirt.host [None req-9e577fe9-d1dc-499e-982e-6876ee66b7a3 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Dec  1 05:09:39 np0005540826 nova_compute[228134]: 2025-12-01 10:09:39.206 228138 DEBUG nova.virt.libvirt.host [None req-9e577fe9-d1dc-499e-982e-6876ee66b7a3 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Dec  1 05:09:39 np0005540826 nova_compute[228134]: 2025-12-01 10:09:39.206 228138 DEBUG nova.virt.libvirt.host [None req-9e577fe9-d1dc-499e-982e-6876ee66b7a3 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Dec  1 05:09:39 np0005540826 nova_compute[228134]: 2025-12-01 10:09:39.206 228138 INFO nova.virt.libvirt.host [None req-9e577fe9-d1dc-499e-982e-6876ee66b7a3 - - - - - -] Secure Boot support detected#033[00m
Dec  1 05:09:39 np0005540826 nova_compute[228134]: 2025-12-01 10:09:39.209 228138 INFO nova.virt.libvirt.driver [None req-9e577fe9-d1dc-499e-982e-6876ee66b7a3 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Dec  1 05:09:39 np0005540826 nova_compute[228134]: 2025-12-01 10:09:39.209 228138 INFO nova.virt.libvirt.driver [None req-9e577fe9-d1dc-499e-982e-6876ee66b7a3 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Dec  1 05:09:39 np0005540826 nova_compute[228134]: 2025-12-01 10:09:39.220 228138 DEBUG nova.virt.libvirt.driver [None req-9e577fe9-d1dc-499e-982e-6876ee66b7a3 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097#033[00m
Dec  1 05:09:39 np0005540826 nova_compute[228134]: 2025-12-01 10:09:39.251 228138 INFO nova.virt.node [None req-9e577fe9-d1dc-499e-982e-6876ee66b7a3 - - - - - -] Determined node identity 19014d04-db84-4f3d-831b-084720e9168c from /var/lib/nova/compute_id#033[00m
Dec  1 05:09:39 np0005540826 nova_compute[228134]: 2025-12-01 10:09:39.270 228138 WARNING nova.compute.manager [None req-9e577fe9-d1dc-499e-982e-6876ee66b7a3 - - - - - -] Compute nodes ['19014d04-db84-4f3d-831b-084720e9168c'] for host compute-1.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.#033[00m
Dec  1 05:09:39 np0005540826 nova_compute[228134]: 2025-12-01 10:09:39.305 228138 INFO nova.compute.manager [None req-9e577fe9-d1dc-499e-982e-6876ee66b7a3 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host#033[00m
Dec  1 05:09:39 np0005540826 nova_compute[228134]: 2025-12-01 10:09:39.352 228138 WARNING nova.compute.manager [None req-9e577fe9-d1dc-499e-982e-6876ee66b7a3 - - - - - -] No compute node record found for host compute-1.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-1.ctlplane.example.com could not be found.#033[00m
Dec  1 05:09:39 np0005540826 nova_compute[228134]: 2025-12-01 10:09:39.352 228138 DEBUG oslo_concurrency.lockutils [None req-9e577fe9-d1dc-499e-982e-6876ee66b7a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:09:39 np0005540826 nova_compute[228134]: 2025-12-01 10:09:39.353 228138 DEBUG oslo_concurrency.lockutils [None req-9e577fe9-d1dc-499e-982e-6876ee66b7a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:09:39 np0005540826 nova_compute[228134]: 2025-12-01 10:09:39.353 228138 DEBUG oslo_concurrency.lockutils [None req-9e577fe9-d1dc-499e-982e-6876ee66b7a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:09:39 np0005540826 nova_compute[228134]: 2025-12-01 10:09:39.353 228138 DEBUG nova.compute.resource_tracker [None req-9e577fe9-d1dc-499e-982e-6876ee66b7a3 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  1 05:09:39 np0005540826 nova_compute[228134]: 2025-12-01 10:09:39.353 228138 DEBUG oslo_concurrency.processutils [None req-9e577fe9-d1dc-499e-982e-6876ee66b7a3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:09:39 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:39 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7920003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:39 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:39 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7928001ba0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:39 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:09:39 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3776919133' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:09:39 np0005540826 nova_compute[228134]: 2025-12-01 10:09:39.857 228138 DEBUG oslo_concurrency.processutils [None req-9e577fe9-d1dc-499e-982e-6876ee66b7a3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.504s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:09:39 np0005540826 podman[228989]: 2025-12-01 10:09:39.859046187 +0000 UTC m=+0.060690468 container health_status 2f6a34a2eb1139886d5df897b4264e2aa17883bd121a8757ac91426f6d44356f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec  1 05:09:39 np0005540826 systemd[1]: Starting libvirt nodedev daemon...
Dec  1 05:09:39 np0005540826 systemd[1]: Started libvirt nodedev daemon.
Dec  1 05:09:40 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:09:40 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000022s ======
Dec  1 05:09:40 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:09:40.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec  1 05:09:40 np0005540826 nova_compute[228134]: 2025-12-01 10:09:40.149 228138 WARNING nova.virt.libvirt.driver [None req-9e577fe9-d1dc-499e-982e-6876ee66b7a3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  1 05:09:40 np0005540826 nova_compute[228134]: 2025-12-01 10:09:40.151 228138 DEBUG nova.compute.resource_tracker [None req-9e577fe9-d1dc-499e-982e-6876ee66b7a3 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5292MB free_disk=59.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  1 05:09:40 np0005540826 nova_compute[228134]: 2025-12-01 10:09:40.151 228138 DEBUG oslo_concurrency.lockutils [None req-9e577fe9-d1dc-499e-982e-6876ee66b7a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:09:40 np0005540826 nova_compute[228134]: 2025-12-01 10:09:40.151 228138 DEBUG oslo_concurrency.lockutils [None req-9e577fe9-d1dc-499e-982e-6876ee66b7a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:09:40 np0005540826 python3.9[229035]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  1 05:09:40 np0005540826 nova_compute[228134]: 2025-12-01 10:09:40.166 228138 WARNING nova.compute.resource_tracker [None req-9e577fe9-d1dc-499e-982e-6876ee66b7a3 - - - - - -] No compute node record for compute-1.ctlplane.example.com:19014d04-db84-4f3d-831b-084720e9168c: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 19014d04-db84-4f3d-831b-084720e9168c could not be found.#033[00m
Dec  1 05:09:40 np0005540826 nova_compute[228134]: 2025-12-01 10:09:40.190 228138 INFO nova.compute.resource_tracker [None req-9e577fe9-d1dc-499e-982e-6876ee66b7a3 - - - - - -] Compute node record created for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com with uuid: 19014d04-db84-4f3d-831b-084720e9168c#033[00m
Dec  1 05:09:40 np0005540826 systemd[1]: Stopping nova_compute container...
Dec  1 05:09:40 np0005540826 nova_compute[228134]: 2025-12-01 10:09:40.254 228138 DEBUG oslo_concurrency.lockutils [None req-9e577fe9-d1dc-499e-982e-6876ee66b7a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.103s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:09:40 np0005540826 nova_compute[228134]: 2025-12-01 10:09:40.255 228138 INFO oslo_messaging._drivers.amqpdriver [-] No calling threads waiting for msg_id : b4b428b7b37a4532bae219ef9488ea80#033[00m
Dec  1 05:09:40 np0005540826 nova_compute[228134]: 2025-12-01 10:09:40.255 228138 DEBUG oslo_concurrency.lockutils [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  1 05:09:40 np0005540826 nova_compute[228134]: 2025-12-01 10:09:40.255 228138 DEBUG oslo_concurrency.lockutils [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  1 05:09:40 np0005540826 nova_compute[228134]: 2025-12-01 10:09:40.256 228138 DEBUG oslo_concurrency.lockutils [None req-4fac2257-d00b-4533-9fce-e55bcf495a58 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  1 05:09:40 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:09:40 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:09:40 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:09:40.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:09:40 np0005540826 virtqemud[228647]: libvirt version: 11.9.0, package: 1.el9 (builder@centos.org, 2025-11-04-09:54:50, )
Dec  1 05:09:40 np0005540826 virtqemud[228647]: hostname: compute-1
Dec  1 05:09:40 np0005540826 virtqemud[228647]: End of file while reading data: Input/output error
Dec  1 05:09:40 np0005540826 systemd[1]: libpod-cfb7cbc57641f129e88c32410827d833c12a2a569b276caeff50c1efc617ea53.scope: Deactivated successfully.
Dec  1 05:09:40 np0005540826 systemd[1]: libpod-cfb7cbc57641f129e88c32410827d833c12a2a569b276caeff50c1efc617ea53.scope: Consumed 3.951s CPU time.
Dec  1 05:09:40 np0005540826 podman[229065]: 2025-12-01 10:09:40.715636285 +0000 UTC m=+0.496193589 container died cfb7cbc57641f129e88c32410827d833c12a2a569b276caeff50c1efc617ea53 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=nova_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS)
Dec  1 05:09:40 np0005540826 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cfb7cbc57641f129e88c32410827d833c12a2a569b276caeff50c1efc617ea53-userdata-shm.mount: Deactivated successfully.
Dec  1 05:09:40 np0005540826 systemd[1]: var-lib-containers-storage-overlay-87b9964f8628ce9ad78dbc8a658c7c252617445a57c548122b5037d37739e098-merged.mount: Deactivated successfully.
Dec  1 05:09:40 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:40 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f794c0043d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:41 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:09:41 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:41 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f791c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:41 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:41 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7920003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:42 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:09:42 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:09:42 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:09:42.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:09:42 np0005540826 podman[229065]: 2025-12-01 10:09:42.472621445 +0000 UTC m=+2.253178739 container cleanup cfb7cbc57641f129e88c32410827d833c12a2a569b276caeff50c1efc617ea53 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=nova_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec  1 05:09:42 np0005540826 podman[229065]: nova_compute
Dec  1 05:09:42 np0005540826 podman[229121]: nova_compute
Dec  1 05:09:42 np0005540826 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Dec  1 05:09:42 np0005540826 systemd[1]: Stopped nova_compute container.
Dec  1 05:09:42 np0005540826 systemd[1]: Starting nova_compute container...
Dec  1 05:09:42 np0005540826 systemd[1]: Started libcrun container.
Dec  1 05:09:42 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/87b9964f8628ce9ad78dbc8a658c7c252617445a57c548122b5037d37739e098/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec  1 05:09:42 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/87b9964f8628ce9ad78dbc8a658c7c252617445a57c548122b5037d37739e098/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Dec  1 05:09:42 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/87b9964f8628ce9ad78dbc8a658c7c252617445a57c548122b5037d37739e098/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec  1 05:09:42 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/87b9964f8628ce9ad78dbc8a658c7c252617445a57c548122b5037d37739e098/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec  1 05:09:42 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/87b9964f8628ce9ad78dbc8a658c7c252617445a57c548122b5037d37739e098/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Dec  1 05:09:42 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:09:42 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:09:42 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:09:42.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:09:42 np0005540826 podman[229134]: 2025-12-01 10:09:42.668603144 +0000 UTC m=+0.090720475 container init cfb7cbc57641f129e88c32410827d833c12a2a569b276caeff50c1efc617ea53 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, container_name=nova_compute, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']})
Dec  1 05:09:42 np0005540826 podman[229134]: 2025-12-01 10:09:42.676181218 +0000 UTC m=+0.098298519 container start cfb7cbc57641f129e88c32410827d833c12a2a569b276caeff50c1efc617ea53 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 05:09:42 np0005540826 podman[229134]: nova_compute
Dec  1 05:09:42 np0005540826 nova_compute[229148]: + sudo -E kolla_set_configs
Dec  1 05:09:42 np0005540826 systemd[1]: Started nova_compute container.
Dec  1 05:09:42 np0005540826 nova_compute[229148]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec  1 05:09:42 np0005540826 nova_compute[229148]: INFO:__main__:Validating config file
Dec  1 05:09:42 np0005540826 nova_compute[229148]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec  1 05:09:42 np0005540826 nova_compute[229148]: INFO:__main__:Copying service configuration files
Dec  1 05:09:42 np0005540826 nova_compute[229148]: INFO:__main__:Deleting /etc/nova/nova.conf
Dec  1 05:09:42 np0005540826 nova_compute[229148]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Dec  1 05:09:42 np0005540826 nova_compute[229148]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Dec  1 05:09:42 np0005540826 nova_compute[229148]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Dec  1 05:09:42 np0005540826 nova_compute[229148]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Dec  1 05:09:42 np0005540826 nova_compute[229148]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Dec  1 05:09:42 np0005540826 nova_compute[229148]: INFO:__main__:Deleting /etc/nova/nova.conf.d/03-ceph-nova.conf
Dec  1 05:09:42 np0005540826 nova_compute[229148]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Dec  1 05:09:42 np0005540826 nova_compute[229148]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Dec  1 05:09:42 np0005540826 nova_compute[229148]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Dec  1 05:09:42 np0005540826 nova_compute[229148]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Dec  1 05:09:42 np0005540826 nova_compute[229148]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Dec  1 05:09:42 np0005540826 nova_compute[229148]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Dec  1 05:09:42 np0005540826 nova_compute[229148]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Dec  1 05:09:42 np0005540826 nova_compute[229148]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Dec  1 05:09:42 np0005540826 nova_compute[229148]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec  1 05:09:42 np0005540826 nova_compute[229148]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec  1 05:09:42 np0005540826 nova_compute[229148]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec  1 05:09:42 np0005540826 nova_compute[229148]: INFO:__main__:Deleting /etc/ceph
Dec  1 05:09:42 np0005540826 nova_compute[229148]: INFO:__main__:Creating directory /etc/ceph
Dec  1 05:09:42 np0005540826 nova_compute[229148]: INFO:__main__:Setting permission for /etc/ceph
Dec  1 05:09:42 np0005540826 nova_compute[229148]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf
Dec  1 05:09:42 np0005540826 nova_compute[229148]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Dec  1 05:09:42 np0005540826 nova_compute[229148]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Dec  1 05:09:42 np0005540826 nova_compute[229148]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Dec  1 05:09:42 np0005540826 nova_compute[229148]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Dec  1 05:09:42 np0005540826 nova_compute[229148]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Dec  1 05:09:42 np0005540826 nova_compute[229148]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec  1 05:09:42 np0005540826 nova_compute[229148]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Dec  1 05:09:42 np0005540826 nova_compute[229148]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Dec  1 05:09:42 np0005540826 nova_compute[229148]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec  1 05:09:42 np0005540826 nova_compute[229148]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Dec  1 05:09:42 np0005540826 nova_compute[229148]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Dec  1 05:09:42 np0005540826 nova_compute[229148]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Dec  1 05:09:42 np0005540826 nova_compute[229148]: INFO:__main__:Writing out command to execute
Dec  1 05:09:42 np0005540826 nova_compute[229148]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Dec  1 05:09:42 np0005540826 nova_compute[229148]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Dec  1 05:09:42 np0005540826 nova_compute[229148]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Dec  1 05:09:42 np0005540826 nova_compute[229148]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec  1 05:09:42 np0005540826 nova_compute[229148]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec  1 05:09:42 np0005540826 nova_compute[229148]: ++ cat /run_command
Dec  1 05:09:42 np0005540826 nova_compute[229148]: + CMD=nova-compute
Dec  1 05:09:42 np0005540826 nova_compute[229148]: + ARGS=
Dec  1 05:09:42 np0005540826 nova_compute[229148]: + sudo kolla_copy_cacerts
Dec  1 05:09:42 np0005540826 nova_compute[229148]: + [[ ! -n '' ]]
Dec  1 05:09:42 np0005540826 nova_compute[229148]: + . kolla_extend_start
Dec  1 05:09:42 np0005540826 nova_compute[229148]: + echo 'Running command: '\''nova-compute'\'''
Dec  1 05:09:42 np0005540826 nova_compute[229148]: Running command: 'nova-compute'
Dec  1 05:09:42 np0005540826 nova_compute[229148]: + umask 0022
Dec  1 05:09:42 np0005540826 nova_compute[229148]: + exec nova-compute
Dec  1 05:09:42 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:42 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7928001ba0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:43 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:43 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f794c0043d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:43 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:43 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f791c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:43 np0005540826 python3.9[229311]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Dec  1 05:09:43 np0005540826 systemd[1]: Started libpod-conmon-a4dfe4fdcf15ffd724c87d934a9b23b6bc3b1013a2f41906449d7aa6a11f07b7.scope.
Dec  1 05:09:43 np0005540826 systemd[1]: Started libcrun container.
Dec  1 05:09:44 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bdef5e97655acb8305c4c193dd2593870c2bf532f7d2d12c5df270168924875a/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Dec  1 05:09:44 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bdef5e97655acb8305c4c193dd2593870c2bf532f7d2d12c5df270168924875a/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec  1 05:09:44 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:09:44 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:09:44 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:09:44.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:09:44 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bdef5e97655acb8305c4c193dd2593870c2bf532f7d2d12c5df270168924875a/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Dec  1 05:09:44 np0005540826 podman[229337]: 2025-12-01 10:09:44.029655625 +0000 UTC m=+0.154338578 container init a4dfe4fdcf15ffd724c87d934a9b23b6bc3b1013a2f41906449d7aa6a11f07b7 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, container_name=nova_compute_init, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec  1 05:09:44 np0005540826 podman[229337]: 2025-12-01 10:09:44.036157611 +0000 UTC m=+0.160840554 container start a4dfe4fdcf15ffd724c87d934a9b23b6bc3b1013a2f41906449d7aa6a11f07b7 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, tcib_managed=true, container_name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  1 05:09:44 np0005540826 python3.9[229311]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Dec  1 05:09:44 np0005540826 nova_compute_init[229358]: INFO:nova_statedir:Applying nova statedir ownership
Dec  1 05:09:44 np0005540826 nova_compute_init[229358]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Dec  1 05:09:44 np0005540826 nova_compute_init[229358]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Dec  1 05:09:44 np0005540826 nova_compute_init[229358]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Dec  1 05:09:44 np0005540826 nova_compute_init[229358]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Dec  1 05:09:44 np0005540826 nova_compute_init[229358]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Dec  1 05:09:44 np0005540826 nova_compute_init[229358]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Dec  1 05:09:44 np0005540826 nova_compute_init[229358]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Dec  1 05:09:44 np0005540826 nova_compute_init[229358]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Dec  1 05:09:44 np0005540826 nova_compute_init[229358]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Dec  1 05:09:44 np0005540826 nova_compute_init[229358]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Dec  1 05:09:44 np0005540826 nova_compute_init[229358]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Dec  1 05:09:44 np0005540826 nova_compute_init[229358]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Dec  1 05:09:44 np0005540826 nova_compute_init[229358]: INFO:nova_statedir:Nova statedir ownership complete
Dec  1 05:09:44 np0005540826 systemd[1]: libpod-a4dfe4fdcf15ffd724c87d934a9b23b6bc3b1013a2f41906449d7aa6a11f07b7.scope: Deactivated successfully.
Dec  1 05:09:44 np0005540826 podman[229371]: 2025-12-01 10:09:44.147004198 +0000 UTC m=+0.033127876 container died a4dfe4fdcf15ffd724c87d934a9b23b6bc3b1013a2f41906449d7aa6a11f07b7 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=nova_compute_init, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=edpm)
Dec  1 05:09:44 np0005540826 systemd[1]: var-lib-containers-storage-overlay-bdef5e97655acb8305c4c193dd2593870c2bf532f7d2d12c5df270168924875a-merged.mount: Deactivated successfully.
Dec  1 05:09:44 np0005540826 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a4dfe4fdcf15ffd724c87d934a9b23b6bc3b1013a2f41906449d7aa6a11f07b7-userdata-shm.mount: Deactivated successfully.
Dec  1 05:09:44 np0005540826 podman[229371]: 2025-12-01 10:09:44.187278806 +0000 UTC m=+0.073402464 container cleanup a4dfe4fdcf15ffd724c87d934a9b23b6bc3b1013a2f41906449d7aa6a11f07b7 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=edpm, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=nova_compute_init)
Dec  1 05:09:44 np0005540826 systemd[1]: libpod-conmon-a4dfe4fdcf15ffd724c87d934a9b23b6bc3b1013a2f41906449d7aa6a11f07b7.scope: Deactivated successfully.
Dec  1 05:09:44 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:09:44 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:09:44 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:09:44.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:09:44 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:44 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7920003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:44 np0005540826 nova_compute[229148]: 2025-12-01 10:09:44.917 229152 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Dec  1 05:09:44 np0005540826 nova_compute[229148]: 2025-12-01 10:09:44.918 229152 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Dec  1 05:09:44 np0005540826 nova_compute[229148]: 2025-12-01 10:09:44.918 229152 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Dec  1 05:09:44 np0005540826 nova_compute[229148]: 2025-12-01 10:09:44.918 229152 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m
Dec  1 05:09:44 np0005540826 systemd-logind[787]: Session 53 logged out. Waiting for processes to exit.
Dec  1 05:09:44 np0005540826 systemd[1]: session-53.scope: Deactivated successfully.
Dec  1 05:09:44 np0005540826 systemd[1]: session-53.scope: Consumed 2min 24.339s CPU time.
Dec  1 05:09:44 np0005540826 systemd-logind[787]: Removed session 53.
Dec  1 05:09:45 np0005540826 nova_compute[229148]: 2025-12-01 10:09:45.072 229152 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:09:45 np0005540826 nova_compute[229148]: 2025-12-01 10:09:45.097 229152 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.025s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:09:45 np0005540826 nova_compute[229148]: 2025-12-01 10:09:45.097 229152 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Dec  1 05:09:45 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:45 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7928002810 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:45 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis[85207]: [WARNING] 334/100945 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  1 05:09:45 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:45 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f794c0043d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:46 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:09:46 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:09:46 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:09:46.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:09:46 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.168 229152 INFO nova.virt.driver [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.280 229152 INFO nova.compute.provider_config [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.290 229152 DEBUG oslo_concurrency.lockutils [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.291 229152 DEBUG oslo_concurrency.lockutils [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.291 229152 DEBUG oslo_concurrency.lockutils [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.291 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.291 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.292 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.292 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.292 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.292 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.292 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.293 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.293 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.293 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.293 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.293 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.294 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.294 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.294 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.294 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.294 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.295 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.295 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.295 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] console_host                   = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.295 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.295 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.295 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.296 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.296 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.296 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.296 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.297 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.297 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.297 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.297 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.297 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.298 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.298 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.298 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.298 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.298 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.299 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.299 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.299 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.299 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.299 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.300 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.300 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.300 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.300 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.300 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.301 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.301 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.301 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.301 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.301 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.301 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.302 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.302 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.302 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.302 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.302 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.303 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.303 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.303 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.303 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.303 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.303 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.304 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.304 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.304 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.304 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.304 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.304 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.305 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.305 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.305 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.305 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.305 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.306 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.306 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.306 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.306 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.306 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.307 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] my_block_storage_ip            = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.307 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] my_ip                          = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.307 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.307 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.307 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.307 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.308 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.308 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.308 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.308 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.308 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.309 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.309 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.309 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.309 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.309 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.309 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.310 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.310 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.310 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.310 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.310 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.311 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.311 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.311 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.311 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.311 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.311 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.312 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.312 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.312 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.312 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.312 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.313 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.313 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.313 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.313 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.313 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.313 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.314 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.314 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.314 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.314 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.314 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.315 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.315 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.315 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.315 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.315 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.315 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.316 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.316 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.316 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.316 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.316 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.316 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.317 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.317 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.317 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.317 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.317 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.318 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.318 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.318 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.318 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.318 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.318 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.319 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.319 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.319 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.319 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.319 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.320 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.320 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.320 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.320 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.320 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.321 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.321 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.321 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.321 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.321 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.322 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.322 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.322 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.322 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.322 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.322 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.323 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.323 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.323 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.323 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.323 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.324 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.324 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.324 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.324 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.324 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.325 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.325 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.325 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.325 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.325 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.325 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.326 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.326 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.326 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.326 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.326 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.327 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.327 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.327 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.327 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.327 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.328 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.328 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.328 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.328 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.328 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.328 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.329 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.329 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.329 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.329 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.329 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.330 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.330 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.330 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.330 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.330 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.330 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.331 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.331 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.331 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.331 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.331 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.332 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.332 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.332 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.332 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.332 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.333 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.333 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.333 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.333 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.333 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.333 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.334 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.334 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.334 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.334 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.334 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.335 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.335 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.335 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.335 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.335 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.335 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.336 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.336 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.336 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.336 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.336 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.337 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.337 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.337 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.337 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.337 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.337 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.338 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.338 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.338 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.338 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.338 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.339 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.339 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.339 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.339 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.339 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.339 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.340 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.340 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.340 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.340 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.340 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.341 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.341 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.341 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.341 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.341 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.342 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.342 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.342 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.342 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.342 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.342 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.343 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.343 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.343 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.343 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.343 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.344 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.344 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.344 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.344 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.344 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.345 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.345 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.345 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.345 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.345 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.345 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.346 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.346 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.346 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.346 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.346 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.347 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.347 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.347 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.347 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.347 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.348 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.348 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.348 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.348 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.348 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.349 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.349 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.349 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.349 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.349 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.349 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.350 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.350 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.350 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.350 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.350 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.351 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.351 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.351 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.351 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.351 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.351 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.352 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.352 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.352 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.352 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.352 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.353 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.353 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.353 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.353 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.353 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.353 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.354 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.354 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.354 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.354 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.354 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.355 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.355 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.355 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.355 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.355 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.356 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.356 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.356 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.356 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.356 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.357 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.357 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.357 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.357 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.357 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.358 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.358 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.358 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.358 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.358 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.359 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.359 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.359 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.359 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.359 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.360 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.360 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.360 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.360 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.360 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.360 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.361 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.361 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.361 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.361 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.361 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.361 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.362 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.362 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.362 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.362 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.363 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.363 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.363 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.363 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.363 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.363 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.364 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.364 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.364 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.364 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.364 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.365 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.365 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.365 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.365 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.365 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.366 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.366 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.366 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.366 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.366 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.367 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.367 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.367 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.367 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.367 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.368 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.368 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.368 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.368 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.368 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.369 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.369 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.369 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.369 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.369 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.369 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.370 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.370 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.370 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.370 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.370 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.371 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.371 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.371 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.371 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.371 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.372 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.372 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.372 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.372 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.372 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.372 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.373 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.373 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.373 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.373 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.373 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.374 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.374 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.374 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.374 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.374 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.374 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.375 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.375 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.375 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.375 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.375 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.376 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.376 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.376 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.376 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.376 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.377 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.377 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.377 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.377 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.377 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.377 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.378 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.378 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.378 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.378 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.378 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.379 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.379 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.379 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.379 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.379 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.379 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.380 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.380 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.380 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.380 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.380 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.381 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.381 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.381 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.381 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.381 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.382 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.382 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.382 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.382 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.382 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.382 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.383 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.383 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.383 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.383 229152 WARNING oslo_config.cfg [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Dec  1 05:09:46 np0005540826 nova_compute[229148]: live_migration_uri is deprecated for removal in favor of two other options that
Dec  1 05:09:46 np0005540826 nova_compute[229148]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Dec  1 05:09:46 np0005540826 nova_compute[229148]: and ``live_migration_inbound_addr`` respectively.
Dec  1 05:09:46 np0005540826 nova_compute[229148]: ).  Its value may be silently ignored in the future.#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.384 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.384 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.384 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.384 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.384 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.385 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.385 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.385 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.385 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.385 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.385 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.386 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.386 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.386 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.386 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.386 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.387 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.387 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.387 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] libvirt.rbd_secret_uuid        = 365f19c2-81e5-5edd-b6b4-280555214d3a log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.387 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.387 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.387 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.388 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.388 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.388 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.388 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.388 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.389 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.389 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.389 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.389 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.389 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.390 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.390 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.390 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.390 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.390 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.390 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.391 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.391 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.391 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.391 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.391 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.392 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.392 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.392 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.392 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.392 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.393 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.393 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.393 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.393 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.393 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.394 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.394 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.394 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.394 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.394 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.394 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.395 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.395 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.395 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.395 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.395 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.396 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.396 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.396 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.396 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.396 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.397 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.397 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.397 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.397 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.397 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.397 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.398 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.398 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.398 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.398 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.398 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.399 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.399 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.399 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.399 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.399 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.400 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.400 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.400 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.400 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.400 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.400 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.401 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.401 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.401 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.401 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.401 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.401 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.401 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.402 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.402 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.402 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.402 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.402 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.402 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.403 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.403 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.403 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.403 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.403 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.403 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.403 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.404 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.404 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.404 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.404 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.404 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.404 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.404 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.405 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.405 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.405 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.405 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.405 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.405 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.405 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.406 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.406 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.406 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.406 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.406 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.406 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.407 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.407 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.407 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.407 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.407 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.407 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.408 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.408 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.408 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.408 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.408 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.409 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.409 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.409 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.409 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.409 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.409 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.409 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.410 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.410 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.410 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.410 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.410 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.410 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.411 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.411 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.411 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.411 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.411 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.411 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.412 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.412 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.412 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.412 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.412 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.412 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.413 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.413 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.413 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.413 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.413 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.413 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.413 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.414 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.414 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.414 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.414 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.414 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.414 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.415 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.415 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.415 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.415 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.415 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.416 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.416 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.416 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.416 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.416 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.416 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.416 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.417 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.417 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.417 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.417 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.417 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.418 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.418 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.418 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.418 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.418 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.418 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.419 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.419 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.419 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.419 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.419 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.419 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.419 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.420 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.420 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.420 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.420 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.420 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.420 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.420 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.420 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.421 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.421 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.421 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.421 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.421 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.421 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.422 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.422 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.422 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.422 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.422 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.422 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.423 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.423 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.423 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.423 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.423 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.423 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.423 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.424 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.424 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.424 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.424 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.424 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.424 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.424 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.424 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.425 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.425 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.425 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.425 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.425 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.425 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.426 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.426 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] vnc.server_proxyclient_address = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.426 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.426 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.426 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.427 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.427 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.427 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.427 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.427 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.427 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.427 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.428 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.428 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.428 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.428 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.429 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.430 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.430 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.430 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.430 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.430 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.431 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.431 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.431 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.431 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.431 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.432 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.432 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.432 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.432 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.432 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.432 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.433 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.433 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.433 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.433 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.433 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.433 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.434 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.434 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.434 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.434 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.434 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.435 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.435 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.435 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.435 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.435 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.435 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.436 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.436 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.436 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.436 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.436 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.437 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.437 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.437 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.437 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.437 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.438 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.438 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.438 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.438 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.438 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.438 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.438 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.439 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.439 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.439 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.439 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.439 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.440 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.440 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.440 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.440 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.440 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.440 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.441 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.441 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.441 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.441 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.441 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.441 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.442 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.442 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.442 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.442 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.442 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.443 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.443 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.443 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.443 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.443 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.444 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.444 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.444 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.444 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.444 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.444 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.445 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.445 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.445 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.445 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.445 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.445 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.445 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.446 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.446 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.446 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.446 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.446 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.446 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.446 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.447 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.447 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.447 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.447 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.447 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.447 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.447 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.448 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.448 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.448 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.448 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.448 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.448 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.448 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.449 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.449 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.449 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.449 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.449 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.449 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.449 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.449 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.450 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.450 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.450 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.450 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.450 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.450 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.450 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.451 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.451 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.451 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.451 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.451 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.451 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.451 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.452 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.452 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.452 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.452 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.452 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.452 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.452 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.453 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.453 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.453 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.453 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.453 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.453 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.453 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.454 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.454 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.454 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.454 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.454 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.454 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.454 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.454 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.455 229152 DEBUG oslo_service.service [None req-94bceff6-fedd-401d-8d38-62562dcf702c - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.456 229152 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.470 229152 INFO nova.virt.node [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] Determined node identity 19014d04-db84-4f3d-831b-084720e9168c from /var/lib/nova/compute_id#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.470 229152 DEBUG nova.virt.libvirt.host [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.471 229152 DEBUG nova.virt.libvirt.host [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.471 229152 DEBUG nova.virt.libvirt.host [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.472 229152 DEBUG nova.virt.libvirt.host [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.483 229152 DEBUG nova.virt.libvirt.host [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7fe50c376280> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.485 229152 DEBUG nova.virt.libvirt.host [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7fe50c376280> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.486 229152 INFO nova.virt.libvirt.driver [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] Connection event '1' reason 'None'#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.493 229152 INFO nova.virt.libvirt.host [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] Libvirt host capabilities <capabilities>
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 
Dec  1 05:09:46 np0005540826 nova_compute[229148]:  <host>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <uuid>bbc1bf3e-9c61-4776-b766-63a97b665391</uuid>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <cpu>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <arch>x86_64</arch>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model>EPYC-Rome-v4</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <vendor>AMD</vendor>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <microcode version='16777317'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <signature family='23' model='49' stepping='0'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <maxphysaddr mode='emulate' bits='40'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature name='x2apic'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature name='tsc-deadline'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature name='osxsave'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature name='hypervisor'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature name='tsc_adjust'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature name='spec-ctrl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature name='stibp'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature name='arch-capabilities'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature name='ssbd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature name='cmp_legacy'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature name='topoext'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature name='virt-ssbd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature name='lbrv'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature name='tsc-scale'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature name='vmcb-clean'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature name='pause-filter'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature name='pfthreshold'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature name='svme-addr-chk'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature name='rdctl-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature name='skip-l1dfl-vmentry'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature name='mds-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature name='pschange-mc-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <pages unit='KiB' size='4'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <pages unit='KiB' size='2048'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <pages unit='KiB' size='1048576'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </cpu>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <power_management>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <suspend_mem/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </power_management>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <iommu support='no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <migration_features>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <live/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <uri_transports>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <uri_transport>tcp</uri_transport>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <uri_transport>rdma</uri_transport>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </uri_transports>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </migration_features>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <topology>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <cells num='1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <cell id='0'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:          <memory unit='KiB'>7864324</memory>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:          <pages unit='KiB' size='4'>1966081</pages>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:          <pages unit='KiB' size='2048'>0</pages>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:          <pages unit='KiB' size='1048576'>0</pages>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:          <distances>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:            <sibling id='0' value='10'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:          </distances>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:          <cpus num='8'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:            <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:            <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:            <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:            <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:            <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:            <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:            <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:            <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:          </cpus>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        </cell>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </cells>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </topology>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <cache>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </cache>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <secmodel>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model>selinux</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <doi>0</doi>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </secmodel>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <secmodel>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model>dac</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <doi>0</doi>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <baselabel type='kvm'>+107:+107</baselabel>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <baselabel type='qemu'>+107:+107</baselabel>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </secmodel>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:  </host>
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 
Dec  1 05:09:46 np0005540826 nova_compute[229148]:  <guest>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <os_type>hvm</os_type>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <arch name='i686'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <wordsize>32</wordsize>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <domain type='qemu'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <domain type='kvm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </arch>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <features>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <pae/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <nonpae/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <acpi default='on' toggle='yes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <apic default='on' toggle='no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <cpuselection/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <deviceboot/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <disksnapshot default='on' toggle='no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <externalSnapshot/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </features>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:  </guest>
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 
Dec  1 05:09:46 np0005540826 nova_compute[229148]:  <guest>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <os_type>hvm</os_type>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <arch name='x86_64'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <wordsize>64</wordsize>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <domain type='qemu'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <domain type='kvm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </arch>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <features>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <acpi default='on' toggle='yes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <apic default='on' toggle='no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <cpuselection/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <deviceboot/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <disksnapshot default='on' toggle='no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <externalSnapshot/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </features>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:  </guest>
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 
Dec  1 05:09:46 np0005540826 nova_compute[229148]: </capabilities>
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.498 229152 DEBUG nova.virt.libvirt.volume.mount [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.500 229152 DEBUG nova.virt.libvirt.host [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.503 229152 DEBUG nova.virt.libvirt.host [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Dec  1 05:09:46 np0005540826 nova_compute[229148]: <domainCapabilities>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:  <path>/usr/libexec/qemu-kvm</path>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:  <domain>kvm</domain>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:  <machine>pc-q35-rhel9.8.0</machine>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:  <arch>i686</arch>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:  <vcpu max='4096'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:  <iothreads supported='yes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:  <os supported='yes'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <enum name='firmware'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <loader supported='yes'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='type'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>rom</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>pflash</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='readonly'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>yes</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>no</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='secure'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>no</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </loader>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:  </os>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:  <cpu>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <mode name='host-passthrough' supported='yes'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='hostPassthroughMigratable'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>on</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>off</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </mode>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <mode name='maximum' supported='yes'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='maximumMigratable'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>on</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>off</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </mode>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <mode name='host-model' supported='yes'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model fallback='forbid'>EPYC-Rome</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <vendor>AMD</vendor>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <maxphysaddr mode='passthrough' limit='40'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature policy='require' name='x2apic'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature policy='require' name='tsc-deadline'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature policy='require' name='hypervisor'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature policy='require' name='tsc_adjust'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature policy='require' name='spec-ctrl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature policy='require' name='stibp'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature policy='require' name='ssbd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature policy='require' name='cmp_legacy'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature policy='require' name='overflow-recov'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature policy='require' name='succor'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature policy='require' name='ibrs'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature policy='require' name='amd-ssbd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature policy='require' name='virt-ssbd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature policy='require' name='lbrv'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature policy='require' name='tsc-scale'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature policy='require' name='vmcb-clean'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature policy='require' name='flushbyasid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature policy='require' name='pause-filter'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature policy='require' name='pfthreshold'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature policy='require' name='svme-addr-chk'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature policy='require' name='lfence-always-serializing'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature policy='disable' name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </mode>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <mode name='custom' supported='yes'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Broadwell'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Broadwell-IBRS'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Broadwell-noTSX'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Broadwell-noTSX-IBRS'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Broadwell-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Broadwell-v2'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Broadwell-v3'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Broadwell-v4'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Cascadelake-Server'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Cascadelake-Server-noTSX'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Cascadelake-Server-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Cascadelake-Server-v2'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Cascadelake-Server-v3'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Cascadelake-Server-v4'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Cascadelake-Server-v5'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Cooperlake'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-bf16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='taa-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Cooperlake-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-bf16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='taa-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Cooperlake-v2'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-bf16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='taa-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Denverton'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='mpx'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Denverton-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='mpx'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Denverton-v2'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Denverton-v3'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Dhyana-v2'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='EPYC-Genoa'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amd-psfd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='auto-ibrs'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-bf16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512ifma'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='no-nested-data-bp'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='null-sel-clr-base'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='stibp-always-on'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='EPYC-Genoa-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amd-psfd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='auto-ibrs'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-bf16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512ifma'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='no-nested-data-bp'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='null-sel-clr-base'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='stibp-always-on'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='EPYC-Milan'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='EPYC-Milan-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='EPYC-Milan-v2'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amd-psfd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='no-nested-data-bp'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='null-sel-clr-base'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='stibp-always-on'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='EPYC-Rome'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='EPYC-Rome-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='EPYC-Rome-v2'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='EPYC-Rome-v3'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='EPYC-v3'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='EPYC-v4'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='GraniteRapids'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-bf16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-fp16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-int8'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-tile'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx-vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-bf16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-fp16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512ifma'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fbsdp-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrc'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrs'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fzrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='mcdt-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pbrsb-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='prefetchiti'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='psdp-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='serialize'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='taa-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xfd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='GraniteRapids-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-bf16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-fp16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-int8'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-tile'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx-vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-bf16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-fp16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512ifma'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fbsdp-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrc'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrs'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fzrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='mcdt-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pbrsb-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='prefetchiti'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='psdp-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='serialize'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='taa-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xfd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='GraniteRapids-v2'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-bf16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-fp16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-int8'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-tile'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx-vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx10'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx10-128'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx10-256'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx10-512'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-bf16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-fp16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512ifma'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='cldemote'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fbsdp-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrc'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrs'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fzrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='mcdt-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='movdir64b'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='movdiri'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pbrsb-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='prefetchiti'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='psdp-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='serialize'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ss'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='taa-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xfd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Haswell'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Haswell-IBRS'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Haswell-noTSX'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Haswell-noTSX-IBRS'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Haswell-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Haswell-v2'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Haswell-v3'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Haswell-v4'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Icelake-Server'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Icelake-Server-noTSX'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Icelake-Server-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Icelake-Server-v2'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Icelake-Server-v3'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='taa-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Icelake-Server-v4'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512ifma'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='taa-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Icelake-Server-v5'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512ifma'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='taa-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Icelake-Server-v6'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512ifma'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='taa-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Icelake-Server-v7'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512ifma'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='taa-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='IvyBridge'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='IvyBridge-IBRS'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='IvyBridge-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='IvyBridge-v2'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='KnightsMill'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-4fmaps'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-4vnniw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512er'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512pf'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ss'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='KnightsMill-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-4fmaps'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-4vnniw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512er'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512pf'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ss'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Opteron_G4'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fma4'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xop'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Opteron_G4-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fma4'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xop'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Opteron_G5'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fma4'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='tbm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xop'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Opteron_G5-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fma4'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='tbm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xop'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='SapphireRapids'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-bf16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-int8'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-tile'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx-vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-bf16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-fp16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512ifma'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrc'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrs'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fzrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='serialize'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='taa-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xfd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='SapphireRapids-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-bf16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-int8'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-tile'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx-vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-bf16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-fp16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512ifma'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrc'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrs'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fzrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='serialize'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='taa-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xfd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='SapphireRapids-v2'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-bf16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-int8'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-tile'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx-vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-bf16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-fp16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512ifma'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fbsdp-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrc'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrs'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fzrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='psdp-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='serialize'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='taa-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xfd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='SapphireRapids-v3'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-bf16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-int8'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-tile'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx-vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-bf16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-fp16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512ifma'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='cldemote'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fbsdp-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrc'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrs'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fzrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='movdir64b'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='movdiri'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='psdp-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='serialize'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ss'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='taa-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xfd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='SierraForest'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx-ifma'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx-ne-convert'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx-vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx-vnni-int8'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='cmpccxadd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fbsdp-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrs'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='mcdt-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pbrsb-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='psdp-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='serialize'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='SierraForest-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx-ifma'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx-ne-convert'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx-vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx-vnni-int8'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='cmpccxadd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fbsdp-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrs'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='mcdt-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pbrsb-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='psdp-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='serialize'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Skylake-Client'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Skylake-Client-IBRS'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Skylake-Client-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Skylake-Client-v2'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Skylake-Client-v3'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Skylake-Client-v4'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Skylake-Server'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Skylake-Server-IBRS'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Skylake-Server-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Skylake-Server-v2'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Skylake-Server-v3'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Skylake-Server-v4'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Skylake-Server-v5'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Snowridge'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='cldemote'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='core-capability'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='movdir64b'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='movdiri'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='mpx'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='split-lock-detect'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Snowridge-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='cldemote'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='core-capability'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='movdir64b'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='movdiri'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='mpx'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='split-lock-detect'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Snowridge-v2'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='cldemote'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='core-capability'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='movdir64b'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='movdiri'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='split-lock-detect'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Snowridge-v3'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='cldemote'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='core-capability'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='movdir64b'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='movdiri'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='split-lock-detect'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Snowridge-v4'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='cldemote'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='movdir64b'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='movdiri'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='athlon'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='3dnow'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='3dnowext'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='athlon-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='3dnow'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='3dnowext'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='core2duo'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ss'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='core2duo-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ss'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='coreduo'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ss'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='coreduo-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ss'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='n270'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ss'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='n270-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ss'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='phenom'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='3dnow'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='3dnowext'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='phenom-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='3dnow'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='3dnowext'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </mode>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:  </cpu>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:  <memoryBacking supported='yes'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <enum name='sourceType'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <value>file</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <value>anonymous</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <value>memfd</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:  </memoryBacking>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:  <devices>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <disk supported='yes'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='diskDevice'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>disk</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>cdrom</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>floppy</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>lun</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='bus'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>fdc</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>scsi</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>virtio</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>usb</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>sata</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='model'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>virtio</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>virtio-transitional</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>virtio-non-transitional</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </disk>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <graphics supported='yes'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='type'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>vnc</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>egl-headless</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>dbus</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </graphics>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <video supported='yes'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='modelType'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>vga</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>cirrus</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>virtio</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>none</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>bochs</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>ramfb</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </video>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <hostdev supported='yes'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='mode'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>subsystem</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='startupPolicy'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>default</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>mandatory</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>requisite</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>optional</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='subsysType'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>usb</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>pci</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>scsi</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='capsType'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='pciBackend'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </hostdev>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <rng supported='yes'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='model'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>virtio</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>virtio-transitional</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>virtio-non-transitional</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='backendModel'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>random</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>egd</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>builtin</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </rng>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <filesystem supported='yes'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='driverType'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>path</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>handle</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>virtiofs</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </filesystem>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <tpm supported='yes'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='model'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>tpm-tis</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>tpm-crb</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='backendModel'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>emulator</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>external</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='backendVersion'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>2.0</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </tpm>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <redirdev supported='yes'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='bus'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>usb</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </redirdev>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <channel supported='yes'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='type'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>pty</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>unix</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </channel>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <crypto supported='yes'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='model'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='type'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>qemu</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='backendModel'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>builtin</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </crypto>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <interface supported='yes'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='backendType'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>default</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>passt</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </interface>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <panic supported='yes'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='model'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>isa</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>hyperv</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </panic>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <console supported='yes'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='type'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>null</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>vc</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>pty</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>dev</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>file</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>pipe</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>stdio</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>udp</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>tcp</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>unix</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>qemu-vdagent</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>dbus</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </console>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:  </devices>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:  <features>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <gic supported='no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <vmcoreinfo supported='yes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <genid supported='yes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <backingStoreInput supported='yes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <backup supported='yes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <async-teardown supported='yes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <ps2 supported='yes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <sev supported='no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <sgx supported='no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <hyperv supported='yes'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='features'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>relaxed</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>vapic</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>spinlocks</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>vpindex</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>runtime</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>synic</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>stimer</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>reset</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>vendor_id</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>frequencies</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>reenlightenment</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>tlbflush</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>ipi</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>avic</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>emsr_bitmap</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>xmm_input</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <defaults>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <spinlocks>4095</spinlocks>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <stimer_direct>on</stimer_direct>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <tlbflush_direct>on</tlbflush_direct>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <tlbflush_extended>on</tlbflush_extended>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <vendor_id>Linux KVM Hv</vendor_id>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </defaults>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </hyperv>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <launchSecurity supported='yes'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='sectype'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>tdx</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </launchSecurity>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:  </features>
Dec  1 05:09:46 np0005540826 nova_compute[229148]: </domainCapabilities>
Dec  1 05:09:46 np0005540826 nova_compute[229148]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.508 229152 DEBUG nova.virt.libvirt.host [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Dec  1 05:09:46 np0005540826 nova_compute[229148]: <domainCapabilities>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:  <path>/usr/libexec/qemu-kvm</path>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:  <domain>kvm</domain>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:  <machine>pc-i440fx-rhel7.6.0</machine>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:  <arch>i686</arch>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:  <vcpu max='240'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:  <iothreads supported='yes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:  <os supported='yes'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <enum name='firmware'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <loader supported='yes'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='type'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>rom</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>pflash</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='readonly'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>yes</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>no</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='secure'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>no</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </loader>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:  </os>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:  <cpu>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <mode name='host-passthrough' supported='yes'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='hostPassthroughMigratable'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>on</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>off</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </mode>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <mode name='maximum' supported='yes'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='maximumMigratable'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>on</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>off</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </mode>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <mode name='host-model' supported='yes'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model fallback='forbid'>EPYC-Rome</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <vendor>AMD</vendor>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <maxphysaddr mode='passthrough' limit='40'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature policy='require' name='x2apic'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature policy='require' name='tsc-deadline'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature policy='require' name='hypervisor'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature policy='require' name='tsc_adjust'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature policy='require' name='spec-ctrl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature policy='require' name='stibp'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature policy='require' name='ssbd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature policy='require' name='cmp_legacy'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature policy='require' name='overflow-recov'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature policy='require' name='succor'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature policy='require' name='ibrs'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature policy='require' name='amd-ssbd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature policy='require' name='virt-ssbd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature policy='require' name='lbrv'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature policy='require' name='tsc-scale'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature policy='require' name='vmcb-clean'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature policy='require' name='flushbyasid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature policy='require' name='pause-filter'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature policy='require' name='pfthreshold'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature policy='require' name='svme-addr-chk'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature policy='require' name='lfence-always-serializing'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature policy='disable' name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </mode>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <mode name='custom' supported='yes'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Broadwell'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Broadwell-IBRS'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Broadwell-noTSX'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Broadwell-noTSX-IBRS'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Broadwell-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Broadwell-v2'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Broadwell-v3'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Broadwell-v4'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Cascadelake-Server'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Cascadelake-Server-noTSX'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Cascadelake-Server-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Cascadelake-Server-v2'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Cascadelake-Server-v3'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Cascadelake-Server-v4'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Cascadelake-Server-v5'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Cooperlake'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-bf16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='taa-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Cooperlake-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-bf16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='taa-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Cooperlake-v2'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-bf16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='taa-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Denverton'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='mpx'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Denverton-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='mpx'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Denverton-v2'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Denverton-v3'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Dhyana-v2'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='EPYC-Genoa'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amd-psfd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='auto-ibrs'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-bf16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512ifma'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='no-nested-data-bp'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='null-sel-clr-base'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='stibp-always-on'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='EPYC-Genoa-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amd-psfd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='auto-ibrs'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-bf16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512ifma'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='no-nested-data-bp'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='null-sel-clr-base'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='stibp-always-on'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='EPYC-Milan'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='EPYC-Milan-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='EPYC-Milan-v2'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amd-psfd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='no-nested-data-bp'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='null-sel-clr-base'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='stibp-always-on'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='EPYC-Rome'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='EPYC-Rome-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='EPYC-Rome-v2'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='EPYC-Rome-v3'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='EPYC-v3'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='EPYC-v4'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='GraniteRapids'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-bf16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-fp16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-int8'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-tile'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx-vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-bf16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-fp16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512ifma'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fbsdp-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrc'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrs'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fzrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='mcdt-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pbrsb-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='prefetchiti'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='psdp-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='serialize'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='taa-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xfd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='GraniteRapids-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-bf16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-fp16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-int8'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-tile'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx-vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-bf16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-fp16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512ifma'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fbsdp-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrc'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrs'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fzrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='mcdt-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pbrsb-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='prefetchiti'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='psdp-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='serialize'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='taa-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xfd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='GraniteRapids-v2'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-bf16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-fp16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-int8'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-tile'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx-vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx10'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx10-128'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx10-256'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx10-512'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-bf16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-fp16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512ifma'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='cldemote'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fbsdp-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrc'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrs'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fzrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='mcdt-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='movdir64b'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='movdiri'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pbrsb-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='prefetchiti'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='psdp-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='serialize'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ss'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='taa-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xfd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Haswell'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Haswell-IBRS'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Haswell-noTSX'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Haswell-noTSX-IBRS'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Haswell-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Haswell-v2'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Haswell-v3'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Haswell-v4'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Icelake-Server'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Icelake-Server-noTSX'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Icelake-Server-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Icelake-Server-v2'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Icelake-Server-v3'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='taa-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Icelake-Server-v4'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512ifma'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='taa-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Icelake-Server-v5'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512ifma'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='taa-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Icelake-Server-v6'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512ifma'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='taa-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Icelake-Server-v7'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512ifma'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='taa-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='IvyBridge'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='IvyBridge-IBRS'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='IvyBridge-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='IvyBridge-v2'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='KnightsMill'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-4fmaps'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-4vnniw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512er'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512pf'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ss'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='KnightsMill-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-4fmaps'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-4vnniw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512er'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512pf'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ss'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Opteron_G4'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fma4'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xop'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Opteron_G4-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fma4'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xop'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Opteron_G5'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fma4'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='tbm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xop'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Opteron_G5-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fma4'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='tbm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xop'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='SapphireRapids'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-bf16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-int8'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-tile'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx-vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-bf16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-fp16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512ifma'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrc'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrs'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fzrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='serialize'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='taa-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xfd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='SapphireRapids-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-bf16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-int8'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-tile'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx-vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-bf16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-fp16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512ifma'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrc'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrs'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fzrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='serialize'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='taa-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xfd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='SapphireRapids-v2'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-bf16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-int8'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-tile'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx-vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-bf16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-fp16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512ifma'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fbsdp-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrc'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrs'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fzrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='psdp-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='serialize'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='taa-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xfd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='SapphireRapids-v3'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-bf16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-int8'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-tile'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx-vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-bf16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-fp16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512ifma'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='cldemote'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fbsdp-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrc'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrs'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fzrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='movdir64b'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='movdiri'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='psdp-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='serialize'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ss'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='taa-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xfd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='SierraForest'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx-ifma'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx-ne-convert'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx-vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx-vnni-int8'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='cmpccxadd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fbsdp-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrs'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='mcdt-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pbrsb-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='psdp-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='serialize'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='SierraForest-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx-ifma'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx-ne-convert'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx-vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx-vnni-int8'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='cmpccxadd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fbsdp-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrs'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='mcdt-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pbrsb-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='psdp-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='serialize'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Skylake-Client'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Skylake-Client-IBRS'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Skylake-Client-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Skylake-Client-v2'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Skylake-Client-v3'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Skylake-Client-v4'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Skylake-Server'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Skylake-Server-IBRS'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Skylake-Server-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Skylake-Server-v2'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Skylake-Server-v3'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Skylake-Server-v4'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Skylake-Server-v5'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Snowridge'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='cldemote'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='core-capability'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='movdir64b'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='movdiri'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='mpx'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='split-lock-detect'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Snowridge-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='cldemote'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='core-capability'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='movdir64b'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='movdiri'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='mpx'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='split-lock-detect'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Snowridge-v2'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='cldemote'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='core-capability'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='movdir64b'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='movdiri'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='split-lock-detect'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Snowridge-v3'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='cldemote'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='core-capability'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='movdir64b'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='movdiri'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='split-lock-detect'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Snowridge-v4'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='cldemote'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='movdir64b'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='movdiri'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='athlon'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='3dnow'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='3dnowext'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='athlon-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='3dnow'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='3dnowext'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='core2duo'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ss'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='core2duo-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ss'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='coreduo'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ss'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='coreduo-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ss'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='n270'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ss'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='n270-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ss'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='phenom'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='3dnow'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='3dnowext'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='phenom-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='3dnow'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='3dnowext'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </mode>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:  </cpu>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:  <memoryBacking supported='yes'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <enum name='sourceType'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <value>file</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <value>anonymous</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <value>memfd</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:  </memoryBacking>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:  <devices>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <disk supported='yes'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='diskDevice'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>disk</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>cdrom</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>floppy</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>lun</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='bus'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>ide</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>fdc</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>scsi</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>virtio</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>usb</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>sata</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='model'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>virtio</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>virtio-transitional</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>virtio-non-transitional</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </disk>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <graphics supported='yes'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='type'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>vnc</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>egl-headless</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>dbus</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </graphics>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <video supported='yes'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='modelType'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>vga</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>cirrus</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>virtio</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>none</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>bochs</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>ramfb</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </video>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <hostdev supported='yes'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='mode'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>subsystem</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='startupPolicy'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>default</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>mandatory</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>requisite</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>optional</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='subsysType'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>usb</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>pci</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>scsi</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='capsType'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='pciBackend'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </hostdev>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <rng supported='yes'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='model'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>virtio</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>virtio-transitional</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>virtio-non-transitional</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='backendModel'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>random</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>egd</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>builtin</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </rng>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <filesystem supported='yes'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='driverType'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>path</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>handle</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>virtiofs</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </filesystem>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <tpm supported='yes'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='model'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>tpm-tis</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>tpm-crb</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='backendModel'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>emulator</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>external</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='backendVersion'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>2.0</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </tpm>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <redirdev supported='yes'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='bus'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>usb</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </redirdev>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <channel supported='yes'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='type'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>pty</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>unix</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </channel>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <crypto supported='yes'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='model'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='type'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>qemu</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='backendModel'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>builtin</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </crypto>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <interface supported='yes'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='backendType'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>default</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>passt</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </interface>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <panic supported='yes'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='model'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>isa</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>hyperv</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </panic>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <console supported='yes'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='type'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>null</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>vc</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>pty</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>dev</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>file</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>pipe</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>stdio</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>udp</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>tcp</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>unix</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>qemu-vdagent</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>dbus</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </console>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:  </devices>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:  <features>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <gic supported='no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <vmcoreinfo supported='yes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <genid supported='yes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <backingStoreInput supported='yes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <backup supported='yes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <async-teardown supported='yes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <ps2 supported='yes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <sev supported='no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <sgx supported='no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <hyperv supported='yes'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='features'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>relaxed</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>vapic</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>spinlocks</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>vpindex</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>runtime</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>synic</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>stimer</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>reset</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>vendor_id</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>frequencies</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>reenlightenment</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>tlbflush</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>ipi</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>avic</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>emsr_bitmap</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>xmm_input</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <defaults>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <spinlocks>4095</spinlocks>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <stimer_direct>on</stimer_direct>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <tlbflush_direct>on</tlbflush_direct>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <tlbflush_extended>on</tlbflush_extended>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <vendor_id>Linux KVM Hv</vendor_id>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </defaults>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </hyperv>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <launchSecurity supported='yes'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='sectype'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>tdx</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </launchSecurity>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:  </features>
Dec  1 05:09:46 np0005540826 nova_compute[229148]: </domainCapabilities>
Dec  1 05:09:46 np0005540826 nova_compute[229148]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.543 229152 DEBUG nova.virt.libvirt.host [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.549 229152 DEBUG nova.virt.libvirt.host [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Dec  1 05:09:46 np0005540826 nova_compute[229148]: <domainCapabilities>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:  <path>/usr/libexec/qemu-kvm</path>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:  <domain>kvm</domain>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:  <machine>pc-q35-rhel9.8.0</machine>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:  <arch>x86_64</arch>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:  <vcpu max='4096'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:  <iothreads supported='yes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:  <os supported='yes'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <enum name='firmware'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <value>efi</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <loader supported='yes'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='type'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>rom</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>pflash</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='readonly'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>yes</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>no</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='secure'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>yes</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>no</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </loader>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:  </os>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:  <cpu>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <mode name='host-passthrough' supported='yes'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='hostPassthroughMigratable'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>on</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>off</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </mode>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <mode name='maximum' supported='yes'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='maximumMigratable'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>on</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>off</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </mode>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <mode name='host-model' supported='yes'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model fallback='forbid'>EPYC-Rome</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <vendor>AMD</vendor>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <maxphysaddr mode='passthrough' limit='40'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature policy='require' name='x2apic'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature policy='require' name='tsc-deadline'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature policy='require' name='hypervisor'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature policy='require' name='tsc_adjust'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature policy='require' name='spec-ctrl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature policy='require' name='stibp'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature policy='require' name='ssbd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature policy='require' name='cmp_legacy'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature policy='require' name='overflow-recov'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature policy='require' name='succor'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature policy='require' name='ibrs'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature policy='require' name='amd-ssbd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature policy='require' name='virt-ssbd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature policy='require' name='lbrv'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature policy='require' name='tsc-scale'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature policy='require' name='vmcb-clean'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature policy='require' name='flushbyasid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature policy='require' name='pause-filter'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature policy='require' name='pfthreshold'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature policy='require' name='svme-addr-chk'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature policy='require' name='lfence-always-serializing'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature policy='disable' name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </mode>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <mode name='custom' supported='yes'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Broadwell'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Broadwell-IBRS'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Broadwell-noTSX'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Broadwell-noTSX-IBRS'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Broadwell-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Broadwell-v2'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Broadwell-v3'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Broadwell-v4'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Cascadelake-Server'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Cascadelake-Server-noTSX'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Cascadelake-Server-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Cascadelake-Server-v2'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Cascadelake-Server-v3'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Cascadelake-Server-v4'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Cascadelake-Server-v5'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Cooperlake'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-bf16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='taa-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Cooperlake-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-bf16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='taa-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Cooperlake-v2'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-bf16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='taa-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Denverton'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='mpx'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Denverton-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='mpx'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Denverton-v2'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Denverton-v3'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Dhyana-v2'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='EPYC-Genoa'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amd-psfd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='auto-ibrs'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-bf16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512ifma'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='no-nested-data-bp'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='null-sel-clr-base'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='stibp-always-on'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='EPYC-Genoa-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amd-psfd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='auto-ibrs'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-bf16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512ifma'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='no-nested-data-bp'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='null-sel-clr-base'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='stibp-always-on'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='EPYC-Milan'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='EPYC-Milan-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='EPYC-Milan-v2'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amd-psfd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='no-nested-data-bp'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='null-sel-clr-base'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='stibp-always-on'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='EPYC-Rome'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='EPYC-Rome-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='EPYC-Rome-v2'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='EPYC-Rome-v3'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='EPYC-v3'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='EPYC-v4'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='GraniteRapids'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-bf16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-fp16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-int8'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-tile'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx-vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-bf16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-fp16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512ifma'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fbsdp-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrc'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrs'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fzrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='mcdt-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pbrsb-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='prefetchiti'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='psdp-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='serialize'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='taa-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xfd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='GraniteRapids-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-bf16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-fp16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-int8'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-tile'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx-vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-bf16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-fp16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512ifma'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fbsdp-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrc'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrs'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fzrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='mcdt-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pbrsb-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='prefetchiti'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='psdp-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='serialize'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='taa-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xfd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='GraniteRapids-v2'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-bf16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-fp16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-int8'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-tile'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx-vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx10'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx10-128'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx10-256'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx10-512'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-bf16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-fp16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512ifma'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='cldemote'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fbsdp-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrc'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrs'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fzrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='mcdt-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='movdir64b'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='movdiri'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pbrsb-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='prefetchiti'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='psdp-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='serialize'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ss'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='taa-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xfd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Haswell'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Haswell-IBRS'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Haswell-noTSX'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Haswell-noTSX-IBRS'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Haswell-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Haswell-v2'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Haswell-v3'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Haswell-v4'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Icelake-Server'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Icelake-Server-noTSX'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Icelake-Server-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Icelake-Server-v2'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Icelake-Server-v3'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='taa-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Icelake-Server-v4'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512ifma'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='taa-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Icelake-Server-v5'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512ifma'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='taa-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Icelake-Server-v6'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512ifma'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='taa-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Icelake-Server-v7'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512ifma'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='taa-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='IvyBridge'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='IvyBridge-IBRS'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='IvyBridge-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='IvyBridge-v2'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='KnightsMill'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-4fmaps'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-4vnniw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512er'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512pf'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ss'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='KnightsMill-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-4fmaps'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-4vnniw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512er'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512pf'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ss'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Opteron_G4'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fma4'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xop'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Opteron_G4-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fma4'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xop'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Opteron_G5'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fma4'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='tbm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xop'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Opteron_G5-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fma4'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='tbm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xop'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='SapphireRapids'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-bf16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-int8'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-tile'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx-vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-bf16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-fp16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512ifma'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrc'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrs'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fzrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='serialize'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='taa-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xfd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='SapphireRapids-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-bf16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-int8'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-tile'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx-vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-bf16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-fp16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512ifma'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrc'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrs'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fzrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='serialize'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='taa-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xfd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='SapphireRapids-v2'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-bf16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-int8'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-tile'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx-vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-bf16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-fp16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512ifma'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fbsdp-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrc'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrs'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fzrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='psdp-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='serialize'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='taa-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xfd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='SapphireRapids-v3'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-bf16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-int8'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-tile'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx-vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-bf16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-fp16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512ifma'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='cldemote'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fbsdp-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrc'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrs'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fzrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='movdir64b'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='movdiri'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='psdp-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='serialize'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ss'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='taa-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xfd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='SierraForest'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx-ifma'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx-ne-convert'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx-vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx-vnni-int8'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='cmpccxadd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fbsdp-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrs'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='mcdt-no'/>
Dec  1 05:09:46 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pbrsb-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='psdp-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='serialize'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='SierraForest-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx-ifma'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx-ne-convert'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx-vnni'/>
Dec  1 05:09:46 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx-vnni-int8'/>
Dec  1 05:09:46 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:09:46.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='cmpccxadd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fbsdp-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrs'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='mcdt-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pbrsb-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='psdp-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='serialize'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Skylake-Client'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Skylake-Client-IBRS'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Skylake-Client-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Skylake-Client-v2'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Skylake-Client-v3'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Skylake-Client-v4'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Skylake-Server'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Skylake-Server-IBRS'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Skylake-Server-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Skylake-Server-v2'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Skylake-Server-v3'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Skylake-Server-v4'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Skylake-Server-v5'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Snowridge'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='cldemote'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='core-capability'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='movdir64b'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='movdiri'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='mpx'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='split-lock-detect'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Snowridge-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='cldemote'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='core-capability'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='movdir64b'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='movdiri'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='mpx'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='split-lock-detect'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Snowridge-v2'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='cldemote'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='core-capability'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='movdir64b'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='movdiri'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='split-lock-detect'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Snowridge-v3'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='cldemote'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='core-capability'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='movdir64b'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='movdiri'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='split-lock-detect'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Snowridge-v4'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='cldemote'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='movdir64b'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='movdiri'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='athlon'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='3dnow'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='3dnowext'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='athlon-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='3dnow'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='3dnowext'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='core2duo'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ss'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='core2duo-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ss'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='coreduo'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ss'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='coreduo-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ss'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='n270'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ss'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='n270-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ss'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='phenom'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='3dnow'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='3dnowext'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='phenom-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='3dnow'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='3dnowext'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </mode>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:  </cpu>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:  <memoryBacking supported='yes'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <enum name='sourceType'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <value>file</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <value>anonymous</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <value>memfd</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:  </memoryBacking>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:  <devices>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <disk supported='yes'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='diskDevice'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>disk</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>cdrom</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>floppy</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>lun</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='bus'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>fdc</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>scsi</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>virtio</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>usb</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>sata</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='model'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>virtio</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>virtio-transitional</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>virtio-non-transitional</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </disk>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <graphics supported='yes'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='type'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>vnc</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>egl-headless</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>dbus</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </graphics>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <video supported='yes'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='modelType'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>vga</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>cirrus</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>virtio</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>none</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>bochs</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>ramfb</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </video>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <hostdev supported='yes'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='mode'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>subsystem</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='startupPolicy'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>default</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>mandatory</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>requisite</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>optional</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='subsysType'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>usb</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>pci</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>scsi</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='capsType'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='pciBackend'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </hostdev>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <rng supported='yes'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='model'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>virtio</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>virtio-transitional</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>virtio-non-transitional</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='backendModel'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>random</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>egd</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>builtin</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </rng>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <filesystem supported='yes'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='driverType'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>path</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>handle</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>virtiofs</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </filesystem>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <tpm supported='yes'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='model'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>tpm-tis</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>tpm-crb</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='backendModel'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>emulator</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>external</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='backendVersion'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>2.0</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </tpm>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <redirdev supported='yes'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='bus'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>usb</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </redirdev>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <channel supported='yes'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='type'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>pty</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>unix</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </channel>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <crypto supported='yes'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='model'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='type'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>qemu</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='backendModel'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>builtin</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </crypto>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <interface supported='yes'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='backendType'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>default</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>passt</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </interface>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <panic supported='yes'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='model'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>isa</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>hyperv</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </panic>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <console supported='yes'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='type'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>null</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>vc</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>pty</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>dev</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>file</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>pipe</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>stdio</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>udp</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>tcp</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>unix</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>qemu-vdagent</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>dbus</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </console>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:  </devices>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:  <features>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <gic supported='no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <vmcoreinfo supported='yes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <genid supported='yes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <backingStoreInput supported='yes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <backup supported='yes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <async-teardown supported='yes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <ps2 supported='yes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <sev supported='no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <sgx supported='no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <hyperv supported='yes'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='features'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>relaxed</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>vapic</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>spinlocks</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>vpindex</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>runtime</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>synic</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>stimer</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>reset</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>vendor_id</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>frequencies</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>reenlightenment</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>tlbflush</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>ipi</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>avic</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>emsr_bitmap</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>xmm_input</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <defaults>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <spinlocks>4095</spinlocks>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <stimer_direct>on</stimer_direct>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <tlbflush_direct>on</tlbflush_direct>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <tlbflush_extended>on</tlbflush_extended>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <vendor_id>Linux KVM Hv</vendor_id>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </defaults>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </hyperv>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <launchSecurity supported='yes'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='sectype'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>tdx</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </launchSecurity>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:  </features>
Dec  1 05:09:46 np0005540826 nova_compute[229148]: </domainCapabilities>
Dec  1 05:09:46 np0005540826 nova_compute[229148]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.613 229152 DEBUG nova.virt.libvirt.host [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Dec  1 05:09:46 np0005540826 nova_compute[229148]: <domainCapabilities>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:  <path>/usr/libexec/qemu-kvm</path>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:  <domain>kvm</domain>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:  <machine>pc-i440fx-rhel7.6.0</machine>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:  <arch>x86_64</arch>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:  <vcpu max='240'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:  <iothreads supported='yes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:  <os supported='yes'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <enum name='firmware'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <loader supported='yes'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='type'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>rom</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>pflash</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='readonly'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>yes</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>no</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='secure'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>no</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </loader>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:  </os>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:  <cpu>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <mode name='host-passthrough' supported='yes'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='hostPassthroughMigratable'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>on</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>off</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </mode>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <mode name='maximum' supported='yes'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='maximumMigratable'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>on</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>off</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </mode>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <mode name='host-model' supported='yes'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model fallback='forbid'>EPYC-Rome</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <vendor>AMD</vendor>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <maxphysaddr mode='passthrough' limit='40'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature policy='require' name='x2apic'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature policy='require' name='tsc-deadline'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature policy='require' name='hypervisor'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature policy='require' name='tsc_adjust'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature policy='require' name='spec-ctrl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature policy='require' name='stibp'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature policy='require' name='ssbd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature policy='require' name='cmp_legacy'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature policy='require' name='overflow-recov'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature policy='require' name='succor'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature policy='require' name='ibrs'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature policy='require' name='amd-ssbd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature policy='require' name='virt-ssbd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature policy='require' name='lbrv'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature policy='require' name='tsc-scale'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature policy='require' name='vmcb-clean'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature policy='require' name='flushbyasid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature policy='require' name='pause-filter'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature policy='require' name='pfthreshold'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature policy='require' name='svme-addr-chk'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature policy='require' name='lfence-always-serializing'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <feature policy='disable' name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </mode>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <mode name='custom' supported='yes'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Broadwell'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Broadwell-IBRS'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Broadwell-noTSX'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Broadwell-noTSX-IBRS'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Broadwell-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Broadwell-v2'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Broadwell-v3'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Broadwell-v4'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Cascadelake-Server'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Cascadelake-Server-noTSX'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Cascadelake-Server-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Cascadelake-Server-v2'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Cascadelake-Server-v3'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Cascadelake-Server-v4'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Cascadelake-Server-v5'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Cooperlake'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-bf16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='taa-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Cooperlake-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-bf16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='taa-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Cooperlake-v2'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-bf16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='taa-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Denverton'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='mpx'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Denverton-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='mpx'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Denverton-v2'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Denverton-v3'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Dhyana-v2'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='EPYC-Genoa'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amd-psfd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='auto-ibrs'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-bf16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512ifma'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='no-nested-data-bp'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='null-sel-clr-base'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='stibp-always-on'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='EPYC-Genoa-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amd-psfd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='auto-ibrs'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-bf16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512ifma'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='no-nested-data-bp'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='null-sel-clr-base'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='stibp-always-on'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='EPYC-Milan'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='EPYC-Milan-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='EPYC-Milan-v2'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amd-psfd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='no-nested-data-bp'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='null-sel-clr-base'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='stibp-always-on'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='EPYC-Rome'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='EPYC-Rome-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='EPYC-Rome-v2'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='EPYC-Rome-v3'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='EPYC-v3'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='EPYC-v4'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='GraniteRapids'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-bf16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-fp16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-int8'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-tile'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx-vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-bf16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-fp16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512ifma'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fbsdp-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrc'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrs'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fzrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='mcdt-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pbrsb-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='prefetchiti'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='psdp-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='serialize'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='taa-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xfd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='GraniteRapids-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-bf16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-fp16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-int8'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-tile'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx-vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-bf16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-fp16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512ifma'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fbsdp-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrc'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrs'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fzrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='mcdt-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pbrsb-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='prefetchiti'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='psdp-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='serialize'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='taa-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xfd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='GraniteRapids-v2'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-bf16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-fp16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-int8'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-tile'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx-vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx10'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx10-128'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx10-256'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx10-512'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-bf16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-fp16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512ifma'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='cldemote'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fbsdp-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrc'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrs'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fzrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='mcdt-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='movdir64b'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='movdiri'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pbrsb-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='prefetchiti'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='psdp-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='serialize'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ss'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='taa-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xfd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Haswell'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Haswell-IBRS'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Haswell-noTSX'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Haswell-noTSX-IBRS'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Haswell-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Haswell-v2'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Haswell-v3'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Haswell-v4'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Icelake-Server'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Icelake-Server-noTSX'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Icelake-Server-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Icelake-Server-v2'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Icelake-Server-v3'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='taa-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Icelake-Server-v4'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512ifma'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='taa-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Icelake-Server-v5'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512ifma'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='taa-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Icelake-Server-v6'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512ifma'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='taa-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Icelake-Server-v7'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512ifma'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='taa-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='IvyBridge'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='IvyBridge-IBRS'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='IvyBridge-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='IvyBridge-v2'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='KnightsMill'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-4fmaps'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-4vnniw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512er'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512pf'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ss'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='KnightsMill-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-4fmaps'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-4vnniw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512er'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512pf'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ss'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Opteron_G4'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fma4'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xop'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Opteron_G4-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fma4'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xop'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Opteron_G5'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fma4'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='tbm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xop'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Opteron_G5-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fma4'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='tbm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xop'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='SapphireRapids'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-bf16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-int8'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-tile'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx-vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-bf16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-fp16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512ifma'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrc'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrs'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fzrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='serialize'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='taa-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xfd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='SapphireRapids-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-bf16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-int8'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-tile'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx-vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-bf16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-fp16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512ifma'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrc'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrs'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fzrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='serialize'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='taa-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xfd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='SapphireRapids-v2'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-bf16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-int8'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-tile'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx-vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-bf16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-fp16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512ifma'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fbsdp-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrc'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrs'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fzrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='psdp-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='serialize'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='taa-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xfd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='SapphireRapids-v3'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-bf16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-int8'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='amx-tile'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx-vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-bf16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-fp16'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512ifma'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='cldemote'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fbsdp-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrc'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrs'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fzrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='movdir64b'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='movdiri'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='psdp-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='serialize'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ss'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='taa-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xfd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='SierraForest'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx-ifma'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx-ne-convert'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx-vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx-vnni-int8'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='cmpccxadd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fbsdp-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrs'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='mcdt-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pbrsb-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='psdp-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='serialize'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='SierraForest-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx-ifma'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx-ne-convert'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx-vnni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx-vnni-int8'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='cmpccxadd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fbsdp-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='fsrs'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='mcdt-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pbrsb-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='psdp-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='serialize'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Skylake-Client'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Skylake-Client-IBRS'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Skylake-Client-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Skylake-Client-v2'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Skylake-Client-v3'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Skylake-Client-v4'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Skylake-Server'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Skylake-Server-IBRS'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Skylake-Server-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Skylake-Server-v2'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Skylake-Server-v3'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Skylake-Server-v4'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Skylake-Server-v5'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Snowridge'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='cldemote'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='core-capability'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='movdir64b'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='movdiri'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='mpx'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='split-lock-detect'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Snowridge-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='cldemote'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='core-capability'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='movdir64b'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='movdiri'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='mpx'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='split-lock-detect'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Snowridge-v2'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='cldemote'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='core-capability'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='movdir64b'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='movdiri'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='split-lock-detect'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Snowridge-v3'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='cldemote'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='core-capability'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='movdir64b'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='movdiri'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='split-lock-detect'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='Snowridge-v4'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='cldemote'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='movdir64b'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='movdiri'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='athlon'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='3dnow'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='3dnowext'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='athlon-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='3dnow'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='3dnowext'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='core2duo'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ss'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='core2duo-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ss'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='coreduo'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ss'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='coreduo-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ss'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='n270'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ss'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='n270-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='ss'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='phenom'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='3dnow'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='3dnowext'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <blockers model='phenom-v1'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='3dnow'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <feature name='3dnowext'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </blockers>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </mode>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:  </cpu>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:  <memoryBacking supported='yes'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <enum name='sourceType'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <value>file</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <value>anonymous</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <value>memfd</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:  </memoryBacking>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:  <devices>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <disk supported='yes'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='diskDevice'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>disk</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>cdrom</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>floppy</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>lun</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='bus'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>ide</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>fdc</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>scsi</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>virtio</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>usb</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>sata</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='model'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>virtio</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>virtio-transitional</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>virtio-non-transitional</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </disk>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <graphics supported='yes'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='type'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>vnc</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>egl-headless</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>dbus</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </graphics>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <video supported='yes'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='modelType'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>vga</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>cirrus</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>virtio</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>none</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>bochs</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>ramfb</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </video>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <hostdev supported='yes'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='mode'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>subsystem</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='startupPolicy'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>default</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>mandatory</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>requisite</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>optional</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='subsysType'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>usb</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>pci</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>scsi</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='capsType'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='pciBackend'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </hostdev>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <rng supported='yes'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='model'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>virtio</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>virtio-transitional</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>virtio-non-transitional</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='backendModel'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>random</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>egd</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>builtin</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </rng>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <filesystem supported='yes'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='driverType'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>path</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>handle</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>virtiofs</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </filesystem>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <tpm supported='yes'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='model'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>tpm-tis</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>tpm-crb</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='backendModel'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>emulator</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>external</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='backendVersion'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>2.0</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </tpm>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <redirdev supported='yes'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='bus'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>usb</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </redirdev>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <channel supported='yes'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='type'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>pty</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>unix</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </channel>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <crypto supported='yes'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='model'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='type'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>qemu</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='backendModel'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>builtin</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </crypto>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <interface supported='yes'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='backendType'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>default</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>passt</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </interface>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <panic supported='yes'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='model'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>isa</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>hyperv</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </panic>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <console supported='yes'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='type'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>null</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>vc</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>pty</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>dev</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>file</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>pipe</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>stdio</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>udp</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>tcp</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>unix</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>qemu-vdagent</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>dbus</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </console>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:  </devices>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:  <features>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <gic supported='no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <vmcoreinfo supported='yes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <genid supported='yes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <backingStoreInput supported='yes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <backup supported='yes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <async-teardown supported='yes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <ps2 supported='yes'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <sev supported='no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <sgx supported='no'/>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <hyperv supported='yes'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='features'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>relaxed</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>vapic</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>spinlocks</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>vpindex</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>runtime</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>synic</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>stimer</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>reset</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>vendor_id</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>frequencies</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>reenlightenment</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>tlbflush</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>ipi</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>avic</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>emsr_bitmap</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>xmm_input</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <defaults>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <spinlocks>4095</spinlocks>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <stimer_direct>on</stimer_direct>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <tlbflush_direct>on</tlbflush_direct>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <tlbflush_extended>on</tlbflush_extended>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <vendor_id>Linux KVM Hv</vendor_id>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </defaults>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </hyperv>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    <launchSecurity supported='yes'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      <enum name='sectype'>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:        <value>tdx</value>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:      </enum>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:    </launchSecurity>
Dec  1 05:09:46 np0005540826 nova_compute[229148]:  </features>
Dec  1 05:09:46 np0005540826 nova_compute[229148]: </domainCapabilities>
Dec  1 05:09:46 np0005540826 nova_compute[229148]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.685 229152 DEBUG nova.virt.libvirt.host [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.685 229152 INFO nova.virt.libvirt.host [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] Secure Boot support detected#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.689 229152 INFO nova.virt.libvirt.driver [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.689 229152 INFO nova.virt.libvirt.driver [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.699 229152 DEBUG nova.virt.libvirt.driver [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.719 229152 INFO nova.virt.node [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] Determined node identity 19014d04-db84-4f3d-831b-084720e9168c from /var/lib/nova/compute_id#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.739 229152 DEBUG nova.compute.manager [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] Verified node 19014d04-db84-4f3d-831b-084720e9168c matches my host compute-1.ctlplane.example.com _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568#033[00m
Dec  1 05:09:46 np0005540826 nova_compute[229148]: 2025-12-01 10:09:46.782 229152 INFO nova.compute.manager [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host#033[00m
Dec  1 05:09:46 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:46 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f791c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:47 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7920003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:47 np0005540826 nova_compute[229148]: 2025-12-01 10:09:47.465 229152 ERROR nova.compute.manager [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] Could not retrieve compute node resource provider 19014d04-db84-4f3d-831b-084720e9168c and therefore unable to error out any instances stuck in BUILDING state. Error: Failed to retrieve allocations for resource provider 19014d04-db84-4f3d-831b-084720e9168c: {"errors": [{"status": 404, "title": "Not Found", "detail": "The resource could not be found.\n\n Resource provider '19014d04-db84-4f3d-831b-084720e9168c' not found: No resource provider with uuid 19014d04-db84-4f3d-831b-084720e9168c found  ", "request_id": "req-59da022d-1e9b-4402-a2fc-bf81738ae165"}]}: nova.exception.ResourceProviderAllocationRetrievalFailed: Failed to retrieve allocations for resource provider 19014d04-db84-4f3d-831b-084720e9168c: {"errors": [{"status": 404, "title": "Not Found", "detail": "The resource could not be found.\n\n Resource provider '19014d04-db84-4f3d-831b-084720e9168c' not found: No resource provider with uuid 19014d04-db84-4f3d-831b-084720e9168c found  ", "request_id": "req-59da022d-1e9b-4402-a2fc-bf81738ae165"}]}#033[00m
Dec  1 05:09:47 np0005540826 nova_compute[229148]: 2025-12-01 10:09:47.493 229152 DEBUG oslo_concurrency.lockutils [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:09:47 np0005540826 nova_compute[229148]: 2025-12-01 10:09:47.493 229152 DEBUG oslo_concurrency.lockutils [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:09:47 np0005540826 nova_compute[229148]: 2025-12-01 10:09:47.493 229152 DEBUG oslo_concurrency.lockutils [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:09:47 np0005540826 nova_compute[229148]: 2025-12-01 10:09:47.493 229152 DEBUG nova.compute.resource_tracker [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  1 05:09:47 np0005540826 nova_compute[229148]: 2025-12-01 10:09:47.494 229152 DEBUG oslo_concurrency.processutils [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:09:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:47 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7928002810 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:47 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:09:47 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2407314549' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:09:47 np0005540826 nova_compute[229148]: 2025-12-01 10:09:47.967 229152 DEBUG oslo_concurrency.processutils [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.473s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:09:48 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:09:48 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:09:48 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:09:48.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:09:48 np0005540826 nova_compute[229148]: 2025-12-01 10:09:48.236 229152 WARNING nova.virt.libvirt.driver [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  1 05:09:48 np0005540826 nova_compute[229148]: 2025-12-01 10:09:48.237 229152 DEBUG nova.compute.resource_tracker [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5293MB free_disk=59.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  1 05:09:48 np0005540826 nova_compute[229148]: 2025-12-01 10:09:48.237 229152 DEBUG oslo_concurrency.lockutils [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:09:48 np0005540826 nova_compute[229148]: 2025-12-01 10:09:48.238 229152 DEBUG oslo_concurrency.lockutils [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:09:48 np0005540826 nova_compute[229148]: 2025-12-01 10:09:48.480 229152 ERROR nova.compute.resource_tracker [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] Skipping removal of allocations for deleted instances: Failed to retrieve allocations for resource provider 19014d04-db84-4f3d-831b-084720e9168c: {"errors": [{"status": 404, "title": "Not Found", "detail": "The resource could not be found.\n\n Resource provider '19014d04-db84-4f3d-831b-084720e9168c' not found: No resource provider with uuid 19014d04-db84-4f3d-831b-084720e9168c found  ", "request_id": "req-3b50214c-5256-4438-b65c-48cca0f8885b"}]}: nova.exception.ResourceProviderAllocationRetrievalFailed: Failed to retrieve allocations for resource provider 19014d04-db84-4f3d-831b-084720e9168c: {"errors": [{"status": 404, "title": "Not Found", "detail": "The resource could not be found.\n\n Resource provider '19014d04-db84-4f3d-831b-084720e9168c' not found: No resource provider with uuid 19014d04-db84-4f3d-831b-084720e9168c found  ", "request_id": "req-3b50214c-5256-4438-b65c-48cca0f8885b"}]}#033[00m
Dec  1 05:09:48 np0005540826 nova_compute[229148]: 2025-12-01 10:09:48.481 229152 DEBUG nova.compute.resource_tracker [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  1 05:09:48 np0005540826 nova_compute[229148]: 2025-12-01 10:09:48.481 229152 DEBUG nova.compute.resource_tracker [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  1 05:09:48 np0005540826 nova_compute[229148]: 2025-12-01 10:09:48.642 229152 INFO nova.scheduler.client.report [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] [req-32df767b-7db3-4b6e-81ca-f835394788b7] Created resource provider record via placement API for resource provider with UUID 19014d04-db84-4f3d-831b-084720e9168c and name compute-1.ctlplane.example.com.#033[00m
Dec  1 05:09:48 np0005540826 nova_compute[229148]: 2025-12-01 10:09:48.659 229152 DEBUG oslo_concurrency.processutils [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:09:48 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:09:48 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000013s ======
Dec  1 05:09:48 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:09:48.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000013s
Dec  1 05:09:48 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:48 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f794c0043d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:49 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:09:49 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2987823263' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:09:49 np0005540826 nova_compute[229148]: 2025-12-01 10:09:49.092 229152 DEBUG oslo_concurrency.processutils [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.432s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:09:49 np0005540826 nova_compute[229148]: 2025-12-01 10:09:49.098 229152 DEBUG nova.virt.libvirt.host [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Dec  1 05:09:49 np0005540826 nova_compute[229148]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803#033[00m
Dec  1 05:09:49 np0005540826 nova_compute[229148]: 2025-12-01 10:09:49.099 229152 INFO nova.virt.libvirt.host [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] kernel doesn't support AMD SEV#033[00m
Dec  1 05:09:49 np0005540826 nova_compute[229148]: 2025-12-01 10:09:49.100 229152 DEBUG nova.compute.provider_tree [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] Updating inventory in ProviderTree for provider 19014d04-db84-4f3d-831b-084720e9168c with inventory: {'MEMORY_MB': {'total': 7680, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 59, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Dec  1 05:09:49 np0005540826 nova_compute[229148]: 2025-12-01 10:09:49.100 229152 DEBUG nova.virt.libvirt.driver [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  1 05:09:49 np0005540826 nova_compute[229148]: 2025-12-01 10:09:49.188 229152 DEBUG nova.scheduler.client.report [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] Updated inventory for provider 19014d04-db84-4f3d-831b-084720e9168c with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7680, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 59, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957#033[00m
Dec  1 05:09:49 np0005540826 nova_compute[229148]: 2025-12-01 10:09:49.188 229152 DEBUG nova.compute.provider_tree [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] Updating resource provider 19014d04-db84-4f3d-831b-084720e9168c generation from 0 to 1 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Dec  1 05:09:49 np0005540826 nova_compute[229148]: 2025-12-01 10:09:49.188 229152 DEBUG nova.compute.provider_tree [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] Updating inventory in ProviderTree for provider 19014d04-db84-4f3d-831b-084720e9168c with inventory: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Dec  1 05:09:49 np0005540826 nova_compute[229148]: 2025-12-01 10:09:49.313 229152 DEBUG nova.compute.provider_tree [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] Updating resource provider 19014d04-db84-4f3d-831b-084720e9168c generation from 1 to 2 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Dec  1 05:09:49 np0005540826 nova_compute[229148]: 2025-12-01 10:09:49.346 229152 DEBUG nova.compute.resource_tracker [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  1 05:09:49 np0005540826 nova_compute[229148]: 2025-12-01 10:09:49.346 229152 DEBUG oslo_concurrency.lockutils [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:09:49 np0005540826 nova_compute[229148]: 2025-12-01 10:09:49.346 229152 DEBUG nova.service [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182#033[00m
Dec  1 05:09:49 np0005540826 nova_compute[229148]: 2025-12-01 10:09:49.411 229152 DEBUG nova.service [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199#033[00m
Dec  1 05:09:49 np0005540826 nova_compute[229148]: 2025-12-01 10:09:49.412 229152 DEBUG nova.servicegroup.drivers.db [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] DB_Driver: join new ServiceGroup member compute-1.ctlplane.example.com to the compute group, service = <Service: host=compute-1.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44#033[00m
Dec  1 05:09:49 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:49 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f791c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:49 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:49 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7920004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:50 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:09:50 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:09:50 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:09:50.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:09:50 np0005540826 nova_compute[229148]: 2025-12-01 10:09:50.413 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:09:50 np0005540826 nova_compute[229148]: 2025-12-01 10:09:50.432 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:09:50 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:09:50 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000012s ======
Dec  1 05:09:50 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:09:50.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000012s
Dec  1 05:09:50 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:50 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7928002810 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:51 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:09:51 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:51 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f794c0043d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:51 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:51 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f791c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:52 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:09:52 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000012s ======
Dec  1 05:09:52 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:09:52.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000012s
Dec  1 05:09:52 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:09:52 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:09:52 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:09:52.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:09:52 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:52 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7920004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:53 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:53 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7920004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:53 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:53 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f794c0043d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:54 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:09:54 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000012s ======
Dec  1 05:09:54 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:09:54.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000012s
Dec  1 05:09:54 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:54 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  1 05:09:54 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:09:54 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000013s ======
Dec  1 05:09:54 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:09:54.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000013s
Dec  1 05:09:54 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:54 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f791c003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:54 np0005540826 podman[229497]: 2025-12-01 10:09:54.984250576 +0000 UTC m=+0.068351013 container health_status 9ce59c2758d021c154e97f8a3178b73d8f43045c59b9ba6fd93369a760a49136 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec  1 05:09:55 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:55 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7928002810 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:55 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:55 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7920004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:56 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:09:56 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:09:56 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:09:56.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:09:56 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:09:56 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:09:56 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:09:56 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:09:56.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:09:56 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:56 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f794c0043d0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:57 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:57 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f791c003c30 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:57 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:57 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  1 05:09:57 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:57 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 05:09:57 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:57 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7928002810 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:58 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:09:58 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000012s ======
Dec  1 05:09:58 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:09:58.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000012s
Dec  1 05:09:58 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:09:58 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000012s ======
Dec  1 05:09:58 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:09:58.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000012s
Dec  1 05:09:58 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:58 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7920004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:59 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:59 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7920004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:59 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:09:59 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f791c003c50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:00 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:10:00 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:10:00 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:10:00.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:10:00 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:10:00 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  1 05:10:00 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:10:00 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:10:00 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:10:00.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:10:00 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:10:00 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7928002810 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:00 np0005540826 ceph-mon[80026]: overall HEALTH_WARN 1 OSD(s) experiencing slow operations in BlueStore; 1 failed cephadm daemon(s)
Dec  1 05:10:01 np0005540826 podman[229521]: 2025-12-01 10:10:01.016773641 +0000 UTC m=+0.087532897 container health_status 6b06bbb7dde72e1297fe084ecc5611549b12415c8a2a7d771b9728c6924dbf55 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec  1 05:10:01 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:10:01 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:10:01 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7920004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:01 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:10:01 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7920004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:02 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:10:02 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000012s ======
Dec  1 05:10:02 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:10:02.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000012s
Dec  1 05:10:02 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:10:02 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000012s ======
Dec  1 05:10:02 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:10:02.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000012s
Dec  1 05:10:02 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:10:02 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f791c003c70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:03 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:10:03 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7928002810 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:03 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:10:03 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7920004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:04 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:10:04 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:10:04 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:10:04.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:10:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:10:04.540 141685 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:10:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:10:04.541 141685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:10:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:10:04.541 141685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:10:04 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:10:04 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:10:04 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:10:04.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:10:04 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:10:04 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f794c0043d0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:05 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:10:05 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f791c003c90 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:05 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:10:05 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7928002810 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:06 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:10:06 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:10:06 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:10:06.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:10:06 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:10:06 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:10:06 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:10:06 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:10:06.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:10:06 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:10:06 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7920004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:07 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:10:07 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f794c0043d0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:07 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis[85207]: [WARNING] 334/101007 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  1 05:10:07 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[218191]: 01/12/2025 10:10:07 : epoch 692d6915 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f791c003cb0 fd 48 proxy ignored for local
Dec  1 05:10:07 np0005540826 kernel: ganesha.nfsd[225362]: segfault at 50 ip 00007f79fdbae32e sp 00007f79cd7f9210 error 4 in libntirpc.so.5.8[7f79fdb93000+2c000] likely on CPU 2 (core 0, socket 2)
Dec  1 05:10:07 np0005540826 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Dec  1 05:10:07 np0005540826 systemd[1]: Started Process Core Dump (PID 229575/UID 0).
Dec  1 05:10:08 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:10:08 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000012s ======
Dec  1 05:10:08 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:10:08.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000012s
Dec  1 05:10:08 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:10:08 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000013s ======
Dec  1 05:10:08 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:10:08.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000013s
Dec  1 05:10:08 np0005540826 systemd-coredump[229576]: Process 218202 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 58:#012#0  0x00007f79fdbae32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Dec  1 05:10:08 np0005540826 systemd[1]: systemd-coredump@9-229575-0.service: Deactivated successfully.
Dec  1 05:10:08 np0005540826 systemd[1]: systemd-coredump@9-229575-0.service: Consumed 1.098s CPU time.
Dec  1 05:10:08 np0005540826 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  1 05:10:08 np0005540826 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  1 05:10:08 np0005540826 podman[229583]: 2025-12-01 10:10:08.877571302 +0000 UTC m=+0.034663733 container died d9b417083696649e5f738a4bf9afe655ab9a3ed709a1ab75b9b8022d135049b5 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True)
Dec  1 05:10:08 np0005540826 systemd[1]: var-lib-containers-storage-overlay-6f89c4fc98e4b5cea678a1e123388c7134f645c2ef19bfd26433be98a5b7eb5c-merged.mount: Deactivated successfully.
Dec  1 05:10:08 np0005540826 podman[229583]: 2025-12-01 10:10:08.916956392 +0000 UTC m=+0.074048803 container remove d9b417083696649e5f738a4bf9afe655ab9a3ed709a1ab75b9b8022d135049b5 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 05:10:08 np0005540826 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.0.0.compute-1.osfnzc.service: Main process exited, code=exited, status=139/n/a
Dec  1 05:10:09 np0005540826 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.0.0.compute-1.osfnzc.service: Failed with result 'exit-code'.
Dec  1 05:10:09 np0005540826 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.0.0.compute-1.osfnzc.service: Consumed 1.495s CPU time.
Dec  1 05:10:09 np0005540826 podman[229627]: 2025-12-01 10:10:09.971841612 +0000 UTC m=+0.056312216 container health_status 2f6a34a2eb1139886d5df897b4264e2aa17883bd121a8757ac91426f6d44356f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0)
Dec  1 05:10:10 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:10:10 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:10:10 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:10:10.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:10:10 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:10:10 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000013s ======
Dec  1 05:10:10 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:10:10.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000013s
Dec  1 05:10:11 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:10:12 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:10:12 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:10:12 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:10:12.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:10:12 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis[85207]: [WARNING] 334/101012 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  1 05:10:12 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:10:12 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:10:12 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:10:12.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:10:13 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis[85207]: [WARNING] 334/101013 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  1 05:10:14 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:10:14 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:10:14 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:10:14.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:10:14 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:10:14 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:10:14 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:10:14.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:10:16 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:10:16 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000013s ======
Dec  1 05:10:16 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:10:16.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000013s
Dec  1 05:10:16 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:10:16 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:10:16 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:10:16 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:10:16.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:10:16 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 05:10:16 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:10:16 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:10:16 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 05:10:16 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec  1 05:10:16 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1630024889' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec  1 05:10:16 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec  1 05:10:16 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1630024889' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec  1 05:10:17 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec  1 05:10:17 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/85141022' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec  1 05:10:17 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec  1 05:10:17 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/85141022' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec  1 05:10:18 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:10:18 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:10:18 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:10:18.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:10:18 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:10:18 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000012s ======
Dec  1 05:10:18 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:10:18.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000012s
Dec  1 05:10:19 np0005540826 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.0.0.compute-1.osfnzc.service: Scheduled restart job, restart counter is at 10.
Dec  1 05:10:19 np0005540826 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.osfnzc for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 05:10:19 np0005540826 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.0.0.compute-1.osfnzc.service: Consumed 1.495s CPU time.
Dec  1 05:10:19 np0005540826 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.osfnzc for 365f19c2-81e5-5edd-b6b4-280555214d3a...
Dec  1 05:10:19 np0005540826 podman[229782]: 2025-12-01 10:10:19.400706151 +0000 UTC m=+0.045129440 container create 6059731c76b47d6540b1e9119e010b72aa25ab86f5d0a246429b25310dc5d1d0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc, CEPH_REF=squid, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default)
Dec  1 05:10:19 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/33e4e0fb9cbf7ea6c622b0713ccbb88d90c683be069a3991e8d1ca678a5f9da1/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Dec  1 05:10:19 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/33e4e0fb9cbf7ea6c622b0713ccbb88d90c683be069a3991e8d1ca678a5f9da1/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 05:10:19 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/33e4e0fb9cbf7ea6c622b0713ccbb88d90c683be069a3991e8d1ca678a5f9da1/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 05:10:19 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/33e4e0fb9cbf7ea6c622b0713ccbb88d90c683be069a3991e8d1ca678a5f9da1/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.osfnzc-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 05:10:19 np0005540826 podman[229782]: 2025-12-01 10:10:19.454860311 +0000 UTC m=+0.099283610 container init 6059731c76b47d6540b1e9119e010b72aa25ab86f5d0a246429b25310dc5d1d0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid)
Dec  1 05:10:19 np0005540826 podman[229782]: 2025-12-01 10:10:19.460448548 +0000 UTC m=+0.104871837 container start 6059731c76b47d6540b1e9119e010b72aa25ab86f5d0a246429b25310dc5d1d0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, CEPH_REF=squid, OSD_FLAVOR=default, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 05:10:19 np0005540826 bash[229782]: 6059731c76b47d6540b1e9119e010b72aa25ab86f5d0a246429b25310dc5d1d0
Dec  1 05:10:19 np0005540826 podman[229782]: 2025-12-01 10:10:19.381447357 +0000 UTC m=+0.025870666 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 05:10:19 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:10:19 : epoch 692d698b : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Dec  1 05:10:19 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:10:19 : epoch 692d698b : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Dec  1 05:10:19 np0005540826 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.osfnzc for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 05:10:19 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:10:19 : epoch 692d698b : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Dec  1 05:10:19 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:10:19 : epoch 692d698b : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Dec  1 05:10:19 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:10:19 : epoch 692d698b : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Dec  1 05:10:19 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:10:19 : epoch 692d698b : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Dec  1 05:10:19 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:10:19 : epoch 692d698b : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Dec  1 05:10:19 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:10:19 : epoch 692d698b : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  1 05:10:20 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:10:20 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:10:20 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:10:20.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:10:20 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:10:20 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:10:20 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:10:20.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:10:21 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:10:22 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:10:22 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:10:22 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:10:22.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:10:22 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:10:22 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:10:22 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:10:22.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:10:23 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:10:24 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:10:24 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:10:24 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:10:24.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:10:24 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:10:24 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:10:24 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:10:24.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:10:24 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:10:25 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:10:25 : epoch 692d698b : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  1 05:10:25 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:10:25 : epoch 692d698b : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 05:10:25 np0005540826 podman[229892]: 2025-12-01 10:10:25.980657779 +0000 UTC m=+0.064532816 container health_status 9ce59c2758d021c154e97f8a3178b73d8f43045c59b9ba6fd93369a760a49136 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec  1 05:10:26 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:10:26 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:10:26 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:10:26.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:10:26 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:10:26 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:10:26 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:10:26 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:10:26.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:10:28 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:10:28 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000012s ======
Dec  1 05:10:28 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:10:28.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000012s
Dec  1 05:10:28 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:10:28 : epoch 692d698b : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  1 05:10:28 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:10:28 : epoch 692d698b : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  1 05:10:28 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:10:28 : epoch 692d698b : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  1 05:10:28 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:10:28 : epoch 692d698b : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 05:10:28 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:10:28 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:10:28 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:10:28.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:10:30 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:10:30 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:10:30 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:10:30.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:10:30 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:10:30 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:10:30 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:10:30.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:10:31 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:10:31 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:10:31 : epoch 692d698b : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  1 05:10:31 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:10:31 : epoch 692d698b : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Dec  1 05:10:31 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:10:31 : epoch 692d698b : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Dec  1 05:10:31 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:10:31 : epoch 692d698b : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Dec  1 05:10:31 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:10:31 : epoch 692d698b : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Dec  1 05:10:31 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:10:31 : epoch 692d698b : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Dec  1 05:10:31 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:10:31 : epoch 692d698b : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Dec  1 05:10:31 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:10:31 : epoch 692d698b : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:10:31 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:10:31 : epoch 692d698b : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:10:31 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:10:31 : epoch 692d698b : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:10:31 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:10:31 : epoch 692d698b : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Dec  1 05:10:31 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:10:31 : epoch 692d698b : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:10:31 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:10:31 : epoch 692d698b : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Dec  1 05:10:31 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:10:31 : epoch 692d698b : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Dec  1 05:10:31 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:10:31 : epoch 692d698b : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Dec  1 05:10:31 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:10:31 : epoch 692d698b : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Dec  1 05:10:31 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:10:31 : epoch 692d698b : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Dec  1 05:10:31 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:10:31 : epoch 692d698b : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Dec  1 05:10:31 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:10:31 : epoch 692d698b : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Dec  1 05:10:31 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:10:31 : epoch 692d698b : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Dec  1 05:10:31 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:10:31 : epoch 692d698b : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Dec  1 05:10:31 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:10:31 : epoch 692d698b : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Dec  1 05:10:31 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:10:31 : epoch 692d698b : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Dec  1 05:10:31 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:10:31 : epoch 692d698b : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Dec  1 05:10:31 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:10:31 : epoch 692d698b : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  1 05:10:31 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:10:31 : epoch 692d698b : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Dec  1 05:10:31 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:10:31 : epoch 692d698b : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  1 05:10:32 np0005540826 podman[229926]: 2025-12-01 10:10:32.03056976 +0000 UTC m=+0.112974457 container health_status 6b06bbb7dde72e1297fe084ecc5611549b12415c8a2a7d771b9728c6924dbf55 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  1 05:10:32 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:10:32 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:10:32 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:10:32.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:10:32 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:10:32 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:10:32 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:10:32.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:10:32 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:10:32 : epoch 692d698b : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9070000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:33 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:10:33 : epoch 692d698b : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f90600016e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:33 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:10:33 : epoch 692d698b : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f904c000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:34 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:10:34 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:10:34 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:10:34.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:10:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis[85207]: [WARNING] 334/101034 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  1 05:10:34 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:10:34 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:10:34 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:10:34.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:10:34 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:10:34 : epoch 692d698b : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9048000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:35 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:10:35 : epoch 692d698b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f90600016e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:35 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis[85207]: [WARNING] 334/101035 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  1 05:10:35 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:10:35 : epoch 692d698b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9054000fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:36 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:10:36 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:10:36 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:10:36.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:10:36 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:10:36 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:10:36 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:10:36 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:10:36.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:10:36 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:10:36 : epoch 692d698b : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f904c0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:37 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:10:37 : epoch 692d698b : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f90480016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:37 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:10:37 : epoch 692d698b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f90600023f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:38 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:10:38 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:10:38 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:10:38.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:10:38 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:10:38 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:10:38 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:10:38.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:10:38 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:10:38 : epoch 692d698b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9054001ae0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:39 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:10:39 : epoch 692d698b : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f904c0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:39 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:10:39 : epoch 692d698b : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f90480016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:40 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:10:40 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:10:40 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:10:40.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:10:40 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:10:40 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:10:40 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:10:40.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:10:40 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:10:40 : epoch 692d698b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f90600023f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:40 np0005540826 podman[229961]: 2025-12-01 10:10:40.969122948 +0000 UTC m=+0.050750013 container health_status 2f6a34a2eb1139886d5df897b4264e2aa17883bd121a8757ac91426f6d44356f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec  1 05:10:41 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:10:41 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:10:41 : epoch 692d698b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9054001ae0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:41 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:10:41 : epoch 692d698b : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f904c0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:42 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:10:42 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:10:42 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:10:42.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:10:42 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:10:42 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:10:42 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:10:42.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:10:42 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:10:42 : epoch 692d698b : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f90480016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:43 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:10:43 : epoch 692d698b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f90600023f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:43 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:10:43 : epoch 692d698b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9054001ae0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:44 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:10:44 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:10:44 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:10:44.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:10:44 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:10:44 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:10:44 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:10:44.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:10:44 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:10:44 : epoch 692d698b : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f904c002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:45 np0005540826 nova_compute[229148]: 2025-12-01 10:10:45.111 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:10:45 np0005540826 nova_compute[229148]: 2025-12-01 10:10:45.112 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:10:45 np0005540826 nova_compute[229148]: 2025-12-01 10:10:45.112 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  1 05:10:45 np0005540826 nova_compute[229148]: 2025-12-01 10:10:45.112 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  1 05:10:45 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:10:45 : epoch 692d698b : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9048002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:45 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:10:45 : epoch 692d698b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f90600023f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:45 np0005540826 nova_compute[229148]: 2025-12-01 10:10:45.695 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  1 05:10:45 np0005540826 nova_compute[229148]: 2025-12-01 10:10:45.695 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:10:45 np0005540826 nova_compute[229148]: 2025-12-01 10:10:45.696 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:10:45 np0005540826 nova_compute[229148]: 2025-12-01 10:10:45.696 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:10:45 np0005540826 nova_compute[229148]: 2025-12-01 10:10:45.696 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:10:45 np0005540826 nova_compute[229148]: 2025-12-01 10:10:45.696 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:10:45 np0005540826 nova_compute[229148]: 2025-12-01 10:10:45.697 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:10:45 np0005540826 nova_compute[229148]: 2025-12-01 10:10:45.697 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  1 05:10:45 np0005540826 nova_compute[229148]: 2025-12-01 10:10:45.697 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:10:45 np0005540826 nova_compute[229148]: 2025-12-01 10:10:45.750 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:10:45 np0005540826 nova_compute[229148]: 2025-12-01 10:10:45.751 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:10:45 np0005540826 nova_compute[229148]: 2025-12-01 10:10:45.751 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:10:45 np0005540826 nova_compute[229148]: 2025-12-01 10:10:45.751 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  1 05:10:45 np0005540826 nova_compute[229148]: 2025-12-01 10:10:45.751 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:10:46 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:10:46 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:10:46 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:10:46.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:10:46 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:10:46 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:10:46 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3591721800' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:10:46 np0005540826 nova_compute[229148]: 2025-12-01 10:10:46.206 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:10:46 np0005540826 nova_compute[229148]: 2025-12-01 10:10:46.362 229152 WARNING nova.virt.libvirt.driver [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  1 05:10:46 np0005540826 nova_compute[229148]: 2025-12-01 10:10:46.364 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5321MB free_disk=59.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  1 05:10:46 np0005540826 nova_compute[229148]: 2025-12-01 10:10:46.364 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:10:46 np0005540826 nova_compute[229148]: 2025-12-01 10:10:46.365 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:10:46 np0005540826 nova_compute[229148]: 2025-12-01 10:10:46.471 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  1 05:10:46 np0005540826 nova_compute[229148]: 2025-12-01 10:10:46.471 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  1 05:10:46 np0005540826 nova_compute[229148]: 2025-12-01 10:10:46.498 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:10:46 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:10:46 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:10:46 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:10:46.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:10:46 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:10:46 : epoch 692d698b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9054002f70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:46 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:10:46 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2882081013' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:10:46 np0005540826 nova_compute[229148]: 2025-12-01 10:10:46.979 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.481s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:10:46 np0005540826 nova_compute[229148]: 2025-12-01 10:10:46.986 229152 DEBUG nova.compute.provider_tree [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Inventory has not changed in ProviderTree for provider: 19014d04-db84-4f3d-831b-084720e9168c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  1 05:10:47 np0005540826 nova_compute[229148]: 2025-12-01 10:10:47.035 229152 DEBUG nova.scheduler.client.report [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Inventory has not changed for provider 19014d04-db84-4f3d-831b-084720e9168c based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  1 05:10:47 np0005540826 nova_compute[229148]: 2025-12-01 10:10:47.036 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  1 05:10:47 np0005540826 nova_compute[229148]: 2025-12-01 10:10:47.036 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.672s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:10:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:10:47 : epoch 692d698b : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f904c002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:10:47 : epoch 692d698b : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9048002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:48 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:10:48 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:10:48 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:10:48.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:10:48 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:10:48 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:10:48 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:10:48.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:10:48 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:10:48 : epoch 692d698b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f90600023f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:49 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:10:49 : epoch 692d698b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9054002f70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:49 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:10:49 : epoch 692d698b : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9054002f70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:50 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:10:50 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:10:50 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:10:50.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:10:50 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:10:50 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:10:50 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:10:50.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:10:50 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:10:50 : epoch 692d698b : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9048002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:51 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:10:51 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:10:51 : epoch 692d698b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f90600023f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:51 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:10:51 : epoch 692d698b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f904c003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:52 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:10:52 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:10:52 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:10:52.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:10:52 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:10:52 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:10:52 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:10:52.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:10:52 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:10:52 : epoch 692d698b : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9054004070 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:53 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:10:53 : epoch 692d698b : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9048003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:53 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:10:53 : epoch 692d698b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f90600023f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:54 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:10:54 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:10:54 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:10:54.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:10:54 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:10:54 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:10:54 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:10:54.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:10:54 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:10:54 : epoch 692d698b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f904c003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:55 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:10:55 : epoch 692d698b : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9054004070 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:55 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:10:55 : epoch 692d698b : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9048003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:56 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:10:56 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:10:56 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:10:56.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:10:56 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:10:56 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:10:56 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:10:56 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:10:56.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:10:56 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:10:56 : epoch 692d698b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f90600023f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:56 np0005540826 podman[230058]: 2025-12-01 10:10:56.979955684 +0000 UTC m=+0.064126421 container health_status 9ce59c2758d021c154e97f8a3178b73d8f43045c59b9ba6fd93369a760a49136 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  1 05:10:57 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:10:57 : epoch 692d698b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f904c003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:57 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:10:57 : epoch 692d698b : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9054004070 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:58 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:10:58 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:10:58 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:10:58.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:10:58 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:10:58 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:10:58 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:10:58.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:10:58 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:10:58 : epoch 692d698b : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9048003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:59 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:10:59 : epoch 692d698b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f90600023f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:59 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:10:59 : epoch 692d698b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f904c003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:00 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:11:00 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:11:00 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:11:00.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:11:00 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:11:00 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:11:00 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:11:00.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:11:00 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:11:00 : epoch 692d698b : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9054004070 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:01 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:11:01 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:11:01 : epoch 692d698b : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9048003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:01 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:11:01 : epoch 692d698b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f90600023f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:02 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:11:02 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:11:02 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:11:02.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:11:02 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:11:02 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:11:02 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:11:02.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:11:02 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:11:02 : epoch 692d698b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f904c003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:03 np0005540826 podman[230108]: 2025-12-01 10:11:03.008065516 +0000 UTC m=+0.087893915 container health_status 6b06bbb7dde72e1297fe084ecc5611549b12415c8a2a7d771b9728c6924dbf55 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Dec  1 05:11:03 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:11:03 : epoch 692d698b : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9054004070 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:03 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:11:03 : epoch 692d698b : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9048003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:04 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:11:04 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:11:04 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:11:04.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:11:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:11:04.541 141685 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:11:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:11:04.542 141685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:11:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:11:04.542 141685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:11:04 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:11:04 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:11:04 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:11:04.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:11:04 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:11:04 : epoch 692d698b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f90600023f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:05 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:11:05 : epoch 692d698b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f90600023f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:05 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:11:05 : epoch 692d698b : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f90600023f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:06 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:11:06 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:11:06 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:11:06.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:11:06 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:11:06 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:11:06 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:11:06 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:11:06.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:11:06 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:11:06 : epoch 692d698b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f904c003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:07 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:11:07 : epoch 692d698b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f904c003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:07 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:11:07 : epoch 692d698b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9038000d00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:08 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:11:08 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:11:08 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:11:08.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:11:08 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:11:08 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:11:08 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:11:08.793 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:11:08 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:11:08 : epoch 692d698b : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f90600023f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:09 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:11:09 : epoch 692d698b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f904c003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:09 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:11:09 : epoch 692d698b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f904c003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:10 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:11:10 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:11:10 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:11:10.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:11:10 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:11:10 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:11:10 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:11:10.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:11:10 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:11:10 : epoch 692d698b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9038001820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:11 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:11:11 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:11:11 : epoch 692d698b : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f90600023f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:11 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:11:11 : epoch 692d698b : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f90600023f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:11 np0005540826 podman[230143]: 2025-12-01 10:11:11.975908897 +0000 UTC m=+0.055481831 container health_status 2f6a34a2eb1139886d5df897b4264e2aa17883bd121a8757ac91426f6d44356f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec  1 05:11:12 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:11:12 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:11:12 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:11:12.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:11:12 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:11:12 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:11:12 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:11:12.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:11:12 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:11:12 : epoch 692d698b : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f90600023f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:13 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:11:13 : epoch 692d698b : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f90600023f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:13 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:11:13 : epoch 692d698b : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f90600023f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:14 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:11:14 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:11:14 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:11:14.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:11:14 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:11:14 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:11:14 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:11:14.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:11:14 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:11:14 : epoch 692d698b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9038001820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:15 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:11:15 : epoch 692d698b : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9070001340 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:15 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:11:15 : epoch 692d698b : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9070001340 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:16 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:11:16 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:11:16 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:11:16.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:11:16 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:11:16 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:11:16 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:11:16 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:11:16.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:11:16 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:11:16 : epoch 692d698b : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f90600023f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:17 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:11:17 : epoch 692d698b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9038001820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:17 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:11:17 : epoch 692d698b : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9070001340 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:18 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:11:18 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:11:18 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:11:18.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:11:18 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:11:18 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:11:18 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:11:18.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:11:18 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:11:18 : epoch 692d698b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f904c004140 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:19 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:11:19 : epoch 692d698b : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f90600023f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:19 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:11:19 : epoch 692d698b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9038002cb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:20 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:11:20 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:11:20 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:11:20.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:11:20 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:11:20 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:11:20 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:11:20.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:11:20 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:11:20 : epoch 692d698b : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9070001340 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:21 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:11:21 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "version", "format": "json"} v 0)
Dec  1 05:11:21 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3098294709' entity='client.openstack' cmd=[{"prefix": "version", "format": "json"}]: dispatch
Dec  1 05:11:21 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:11:21 : epoch 692d698b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f904c004140 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:21 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:11:21 : epoch 692d698b : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f90600023f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:22 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:11:22 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:11:22 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:11:22.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:11:22 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis[85207]: [WARNING] 334/101122 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  1 05:11:22 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:11:22 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:11:22 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:11:22.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:11:22 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:11:22 : epoch 692d698b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9038002cb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:23 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[229797]: 01/12/2025 10:11:23 : epoch 692d698b : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9070001340 fd 39 proxy ignored for local
Dec  1 05:11:23 np0005540826 kernel: ganesha.nfsd[230137]: segfault at 50 ip 00007f911bc9e32e sp 00007f90e77fd210 error 4 in libntirpc.so.5.8[7f911bc83000+2c000] likely on CPU 7 (core 0, socket 7)
Dec  1 05:11:23 np0005540826 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Dec  1 05:11:23 np0005540826 systemd[1]: Started Process Core Dump (PID 230194/UID 0).
Dec  1 05:11:24 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:11:24 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:11:24 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:11:24.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:11:24 np0005540826 systemd-coredump[230195]: Process 229801 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 56:#012#0  0x00007f911bc9e32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Dec  1 05:11:24 np0005540826 systemd[1]: systemd-coredump@10-230194-0.service: Deactivated successfully.
Dec  1 05:11:24 np0005540826 systemd[1]: systemd-coredump@10-230194-0.service: Consumed 1.198s CPU time.
Dec  1 05:11:24 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:11:24 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:11:24 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:11:24.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:11:24 np0005540826 podman[230282]: 2025-12-01 10:11:24.853327909 +0000 UTC m=+0.024518084 container died 6059731c76b47d6540b1e9119e010b72aa25ab86f5d0a246429b25310dc5d1d0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc, ceph=True, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 05:11:24 np0005540826 systemd[1]: var-lib-containers-storage-overlay-33e4e0fb9cbf7ea6c622b0713ccbb88d90c683be069a3991e8d1ca678a5f9da1-merged.mount: Deactivated successfully.
Dec  1 05:11:25 np0005540826 podman[230282]: 2025-12-01 10:11:25.031511888 +0000 UTC m=+0.202702043 container remove 6059731c76b47d6540b1e9119e010b72aa25ab86f5d0a246429b25310dc5d1d0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec  1 05:11:25 np0005540826 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.0.0.compute-1.osfnzc.service: Main process exited, code=exited, status=139/n/a
Dec  1 05:11:25 np0005540826 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.0.0.compute-1.osfnzc.service: Failed with result 'exit-code'.
Dec  1 05:11:25 np0005540826 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.0.0.compute-1.osfnzc.service: Consumed 1.588s CPU time.
Dec  1 05:11:25 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 05:11:25 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:11:25 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:11:25 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 05:11:26 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:11:26 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:11:26 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:11:26.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:11:26 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:11:26 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:11:26 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:11:26 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:11:26.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:11:27 np0005540826 podman[230326]: 2025-12-01 10:11:27.980501716 +0000 UTC m=+0.061660488 container health_status 9ce59c2758d021c154e97f8a3178b73d8f43045c59b9ba6fd93369a760a49136 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec  1 05:11:28 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:11:28 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:11:28 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:11:28.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:11:28 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:11:28 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:11:28 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:11:28.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:11:29 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis[85207]: [WARNING] 334/101129 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  1 05:11:30 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:11:30 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:11:30 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:11:30.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:11:30 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:11:30 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:11:30 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:11:30.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:11:31 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:11:31 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:11:31 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:11:32 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:11:32 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:11:32 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:11:32.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:11:32 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:11:32 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:11:32 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:11:32.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:11:33 np0005540826 podman[230375]: 2025-12-01 10:11:33.999047574 +0000 UTC m=+0.083257817 container health_status 6b06bbb7dde72e1297fe084ecc5611549b12415c8a2a7d771b9728c6924dbf55 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 05:11:34 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:11:34 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:11:34 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:11:34.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:11:34 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:11:34 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:11:34 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:11:34.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:11:35 np0005540826 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.0.0.compute-1.osfnzc.service: Scheduled restart job, restart counter is at 11.
Dec  1 05:11:35 np0005540826 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.osfnzc for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 05:11:35 np0005540826 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.0.0.compute-1.osfnzc.service: Consumed 1.588s CPU time.
Dec  1 05:11:35 np0005540826 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.osfnzc for 365f19c2-81e5-5edd-b6b4-280555214d3a...
Dec  1 05:11:35 np0005540826 podman[230451]: 2025-12-01 10:11:35.62769829 +0000 UTC m=+0.047709274 container create 29c46bbff73daede95d005d13d1e164d4a3762b27bf614a3b087b8f4dd9c08bd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, ceph=True, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 05:11:35 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6f4e251abeed207ca6218b60c1f736e66593d371ba91812056310d68085171f4/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Dec  1 05:11:35 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6f4e251abeed207ca6218b60c1f736e66593d371ba91812056310d68085171f4/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 05:11:35 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6f4e251abeed207ca6218b60c1f736e66593d371ba91812056310d68085171f4/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 05:11:35 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6f4e251abeed207ca6218b60c1f736e66593d371ba91812056310d68085171f4/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.osfnzc-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 05:11:35 np0005540826 podman[230451]: 2025-12-01 10:11:35.678407078 +0000 UTC m=+0.098418082 container init 29c46bbff73daede95d005d13d1e164d4a3762b27bf614a3b087b8f4dd9c08bd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec  1 05:11:35 np0005540826 podman[230451]: 2025-12-01 10:11:35.690860875 +0000 UTC m=+0.110871859 container start 29c46bbff73daede95d005d13d1e164d4a3762b27bf614a3b087b8f4dd9c08bd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True)
Dec  1 05:11:35 np0005540826 bash[230451]: 29c46bbff73daede95d005d13d1e164d4a3762b27bf614a3b087b8f4dd9c08bd
Dec  1 05:11:35 np0005540826 podman[230451]: 2025-12-01 10:11:35.606220094 +0000 UTC m=+0.026231128 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 05:11:35 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:11:35 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Dec  1 05:11:35 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:11:35 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Dec  1 05:11:35 np0005540826 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.osfnzc for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 05:11:35 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:11:35 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Dec  1 05:11:35 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:11:35 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Dec  1 05:11:35 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:11:35 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Dec  1 05:11:35 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:11:35 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Dec  1 05:11:35 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:11:35 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Dec  1 05:11:35 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:11:35 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  1 05:11:36 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:11:36 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:11:36 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:11:36.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:11:36 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:11:36 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:11:36 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:11:36 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:11:36.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:11:38 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:11:38 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:11:38 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:11:38.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:11:38 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:11:38 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:11:38 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:11:38.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:11:40 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:11:40 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:11:40 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:11:40.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:11:40 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:11:40 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:11:40 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:11:40.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:11:41 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:11:41 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:11:41 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[main] rados_kv_traverse :CLIENT ID :EVENT :Failed to lst kv ret=-2
Dec  1 05:11:41 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:11:41 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[main] rados_cluster_read_clids :CLIENT ID :EVENT :Failed to traverse recovery db: -2
Dec  1 05:11:41 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:11:41 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  1 05:11:41 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:11:41 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 05:11:41 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:11:41 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  1 05:11:41 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:11:41 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  1 05:11:41 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:11:41 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  1 05:11:41 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:11:41 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 05:11:41 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:11:41 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  1 05:11:41 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:11:41 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  1 05:11:41 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:11:41 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  1 05:11:41 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:11:41 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 05:11:42 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:11:42 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:11:42 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:11:42.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:11:42 np0005540826 podman[230537]: 2025-12-01 10:11:42.226422105 +0000 UTC m=+0.061590286 container health_status 2f6a34a2eb1139886d5df897b4264e2aa17883bd121a8757ac91426f6d44356f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2)
Dec  1 05:11:42 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:11:42 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:11:42 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:11:42.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:11:44 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:11:44 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:11:44 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:11:44.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:11:44 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "version", "format": "json"} v 0)
Dec  1 05:11:44 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1575990772' entity='client.openstack' cmd=[{"prefix": "version", "format": "json"}]: dispatch
Dec  1 05:11:44 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis[85207]: [WARNING] 334/101144 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 1ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  1 05:11:44 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:11:44 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:11:44 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:11:44.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:11:46 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:11:46 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:11:46 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:11:46 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:11:46.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:11:46 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:11:46 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:11:46 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:11:46.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:11:47 np0005540826 nova_compute[229148]: 2025-12-01 10:11:47.030 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:11:47 np0005540826 nova_compute[229148]: 2025-12-01 10:11:47.044 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:11:47 np0005540826 nova_compute[229148]: 2025-12-01 10:11:47.045 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  1 05:11:47 np0005540826 nova_compute[229148]: 2025-12-01 10:11:47.045 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  1 05:11:47 np0005540826 nova_compute[229148]: 2025-12-01 10:11:47.057 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  1 05:11:47 np0005540826 nova_compute[229148]: 2025-12-01 10:11:47.057 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:11:47 np0005540826 nova_compute[229148]: 2025-12-01 10:11:47.057 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:11:47 np0005540826 nova_compute[229148]: 2025-12-01 10:11:47.057 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:11:47 np0005540826 nova_compute[229148]: 2025-12-01 10:11:47.057 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:11:47 np0005540826 nova_compute[229148]: 2025-12-01 10:11:47.058 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:11:47 np0005540826 nova_compute[229148]: 2025-12-01 10:11:47.076 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:11:47 np0005540826 nova_compute[229148]: 2025-12-01 10:11:47.077 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:11:47 np0005540826 nova_compute[229148]: 2025-12-01 10:11:47.077 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:11:47 np0005540826 nova_compute[229148]: 2025-12-01 10:11:47.077 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  1 05:11:47 np0005540826 nova_compute[229148]: 2025-12-01 10:11:47.077 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:11:47 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:11:47 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2014530076' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:11:47 np0005540826 nova_compute[229148]: 2025-12-01 10:11:47.542 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.465s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:11:47 np0005540826 nova_compute[229148]: 2025-12-01 10:11:47.694 229152 WARNING nova.virt.libvirt.driver [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  1 05:11:47 np0005540826 nova_compute[229148]: 2025-12-01 10:11:47.695 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5281MB free_disk=59.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  1 05:11:47 np0005540826 nova_compute[229148]: 2025-12-01 10:11:47.696 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:11:47 np0005540826 nova_compute[229148]: 2025-12-01 10:11:47.696 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:11:47 np0005540826 nova_compute[229148]: 2025-12-01 10:11:47.752 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  1 05:11:47 np0005540826 nova_compute[229148]: 2025-12-01 10:11:47.752 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  1 05:11:47 np0005540826 nova_compute[229148]: 2025-12-01 10:11:47.770 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:11:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:11:47 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[main] rados_cluster_end_grace :CLIENT ID :EVENT :Failed to remove rec-0000000000000020:nfs.cephfs.0: -2
Dec  1 05:11:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:11:47 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  1 05:11:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:11:47 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Dec  1 05:11:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:11:47 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Dec  1 05:11:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:11:47 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Dec  1 05:11:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:11:47 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Dec  1 05:11:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:11:47 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Dec  1 05:11:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:11:47 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Dec  1 05:11:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:11:47 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:11:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:11:47 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:11:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:11:47 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:11:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:11:47 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Dec  1 05:11:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:11:47 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:11:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:11:47 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Dec  1 05:11:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:11:47 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Dec  1 05:11:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:11:47 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Dec  1 05:11:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:11:47 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Dec  1 05:11:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:11:47 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Dec  1 05:11:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:11:47 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Dec  1 05:11:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:11:47 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Dec  1 05:11:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:11:47 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Dec  1 05:11:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:11:47 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Dec  1 05:11:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:11:47 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Dec  1 05:11:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:11:47 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  1 05:11:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:11:47 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Dec  1 05:11:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:11:47 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  1 05:11:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:11:47 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Dec  1 05:11:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:11:47 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Dec  1 05:11:48 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:11:48 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:11:48 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:11:48.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:11:48 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:11:48 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/977174623' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:11:48 np0005540826 nova_compute[229148]: 2025-12-01 10:11:48.281 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.511s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:11:48 np0005540826 nova_compute[229148]: 2025-12-01 10:11:48.287 229152 DEBUG nova.compute.provider_tree [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Inventory has not changed in ProviderTree for provider: 19014d04-db84-4f3d-831b-084720e9168c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  1 05:11:48 np0005540826 nova_compute[229148]: 2025-12-01 10:11:48.521 229152 DEBUG nova.scheduler.client.report [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Inventory has not changed for provider 19014d04-db84-4f3d-831b-084720e9168c based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  1 05:11:48 np0005540826 nova_compute[229148]: 2025-12-01 10:11:48.523 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  1 05:11:48 np0005540826 nova_compute[229148]: 2025-12-01 10:11:48.524 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.828s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:11:48 np0005540826 nova_compute[229148]: 2025-12-01 10:11:48.576 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:11:48 np0005540826 nova_compute[229148]: 2025-12-01 10:11:48.576 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:11:48 np0005540826 nova_compute[229148]: 2025-12-01 10:11:48.576 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:11:48 np0005540826 nova_compute[229148]: 2025-12-01 10:11:48.577 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  1 05:11:48 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:11:48 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:11:48 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:11:48.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:11:48 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:11:48 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff8b0000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:49 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:11:49 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff89c001970 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:49 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:11:49 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff880000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:50 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:11:50 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:11:50 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:11:50.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:11:50 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:11:50 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:11:50 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:11:50.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:11:50 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:11:50 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff8a80013a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:51 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:11:51 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:11:51 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff884000fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:51 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis[85207]: [WARNING] 334/101151 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  1 05:11:51 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:11:51 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff89c002290 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:52 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:11:52 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:11:52 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:11:52.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:11:52 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:11:52 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:11:52 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:11:52.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:11:52 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:11:52 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff8800016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:53 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:11:53 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff89c002290 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:53 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:11:53 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff8a8002090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:54 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:11:54 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:11:54 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:11:54.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:11:54 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:11:54 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:11:54 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:11:54.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:11:54 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:11:54 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff884001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:55 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:11:55 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff8800016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:55 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:11:55 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff89c002290 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:56 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:11:56 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:11:56 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:11:56 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:11:56.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:11:56 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:11:56 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:11:56 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:11:56.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:11:56 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:11:56 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff8a8002090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:57 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:11:57 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff8840023e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:57 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:11:57 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff8800016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:58 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:11:58 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:11:58 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:11:58.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:11:58 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:11:58 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:11:58 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:11:58.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:11:58 np0005540826 podman[230624]: 2025-12-01 10:11:58.968653926 +0000 UTC m=+0.053064419 container health_status 9ce59c2758d021c154e97f8a3178b73d8f43045c59b9ba6fd93369a760a49136 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 05:11:58 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:11:58 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff8840023e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:59 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:11:59 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff8a8002090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:59 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:11:59 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff89c002290 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:00 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:12:00 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:00 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:12:00.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:00 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:12:00 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:00 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:12:00.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:00 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:12:00 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff880002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:01 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:12:01 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:12:01 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff8840023e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:01 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:12:01 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff8a8003520 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:02 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:12:02 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:02 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:12:02.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:02 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:12:02 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:02 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:12:02.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:02 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:12:02 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff89c002290 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:03 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:12:03 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff880002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:03 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:12:03 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff8840023e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:04 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:12:04 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:12:04 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:12:04.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:12:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:12:04.543 141685 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:12:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:12:04.543 141685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:12:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:12:04.544 141685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:12:04 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:12:04 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:04 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:12:04.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:04 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:12:04 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff8a8003520 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:05 np0005540826 podman[230672]: 2025-12-01 10:12:05.02905302 +0000 UTC m=+0.106383084 container health_status 6b06bbb7dde72e1297fe084ecc5611549b12415c8a2a7d771b9728c6924dbf55 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec  1 05:12:05 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:12:05 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff89c002290 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:05 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:12:05 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff880002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:06 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:12:06 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:12:06 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:12:06 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:12:06.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:12:06 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:12:06 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:06 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:12:06.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:06 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:12:06 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff880002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:07 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:12:07 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff880002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:07 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:12:07 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff89c002290 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:08 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:12:08 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:12:08 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:12:08.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:12:08 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:12:08 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:12:08 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:12:08.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:12:08 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:12:08 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff8a8003520 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:09 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:12:09 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff8a8003520 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:09 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:12:09 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff880003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:10 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:12:10 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:10 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:12:10.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:10 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:12:10 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:10 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:12:10.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:10 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:12:10 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff89c002290 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:11 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:12:11 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:12:11 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff8a8003520 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:11 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:12:11 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff8a8003520 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:12 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:12:12 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:12 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:12:12.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:12 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:12:12 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:12 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:12:12.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:12 np0005540826 podman[230702]: 2025-12-01 10:12:12.992516848 +0000 UTC m=+0.065634760 container health_status 2f6a34a2eb1139886d5df897b4264e2aa17883bd121a8757ac91426f6d44356f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent)
Dec  1 05:12:12 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:12:12 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff880003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:13 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:12:13 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff89c002290 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:13 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:12:13 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff8a8003520 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:14 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:12:14 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:14 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:12:14.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:14 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:12:14 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:12:14 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:12:14.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:12:14 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:12:14 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff884003b20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:15 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:12:15 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff880003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:15 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:12:15 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff89c002290 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:16 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:12:16 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:12:16 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:16 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:12:16.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:16 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:12:16 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:16 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:12:16.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:16 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:12:16 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff89c002290 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:17 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:12:17 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff884003b20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:17 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:12:17 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff884003b20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:18 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:12:18 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:18 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:12:18.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:18 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:12:18 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:18 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:12:18.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:19 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:12:18 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff89c002290 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:19 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:12:19 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff8a8003520 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:19 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:12:19 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff8a8003520 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:20 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:12:20 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:20 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:12:20.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:20 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:12:20 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:20 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:12:20.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:21 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:12:21 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff8a8003520 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:21 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:12:21 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:12:21 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff8a8003520 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:21 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:12:21 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff8a4000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:22 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:12:22 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:22 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:12:22.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:22 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:12:22 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:22 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:12:22.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:23 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:12:23 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff884004440 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:23 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:12:23 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff88c000fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:23 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:12:23 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff8a8003520 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:24 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:12:24 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:24 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:12:24.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:24 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:12:24 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:24 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:12:24.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:25 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:12:25 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff8a4001930 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:25 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:12:25 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff884004440 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:25 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:12:25 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff884004440 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:26 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:12:26 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:12:26 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:12:26 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:12:26.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:12:26 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:12:26 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:26 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:12:26.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:27 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:12:27 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff8a8004620 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:27 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:12:27 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff8a8004620 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:27 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:12:27 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff8a8004620 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:28 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:12:28 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:28 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:12:28.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:28 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:12:28 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:28 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:12:28.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:29 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:12:29 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff884004440 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:29 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:12:29 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff8a4002250 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:29 np0005540826 kernel: ganesha.nfsd[230727]: segfault at 50 ip 00007ff95c3f332e sp 00007ff92e7fb210 error 4 in libntirpc.so.5.8[7ff95c3d8000+2c000] likely on CPU 0 (core 0, socket 0)
Dec  1 05:12:29 np0005540826 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Dec  1 05:12:29 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[230467]: 01/12/2025 10:12:29 : epoch 692d69d7 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff8a4002250 fd 39 proxy ignored for local
Dec  1 05:12:29 np0005540826 systemd[1]: Started Process Core Dump (PID 230757/UID 0).
Dec  1 05:12:29 np0005540826 podman[230758]: 2025-12-01 10:12:29.829934308 +0000 UTC m=+0.062507518 container health_status 9ce59c2758d021c154e97f8a3178b73d8f43045c59b9ba6fd93369a760a49136 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec  1 05:12:30 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:12:30 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:30 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:12:30.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:30 np0005540826 systemd-coredump[230759]: Process 230471 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 57:#012#0  0x00007ff95c3f332e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Dec  1 05:12:30 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis[85207]: [WARNING] 334/101230 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  1 05:12:30 np0005540826 systemd[1]: systemd-coredump@11-230757-0.service: Deactivated successfully.
Dec  1 05:12:30 np0005540826 systemd[1]: systemd-coredump@11-230757-0.service: Consumed 1.017s CPU time.
Dec  1 05:12:30 np0005540826 podman[230782]: 2025-12-01 10:12:30.87929118 +0000 UTC m=+0.031678214 container died 29c46bbff73daede95d005d13d1e164d4a3762b27bf614a3b087b8f4dd9c08bd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc, CEPH_REF=squid, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 05:12:30 np0005540826 systemd[1]: var-lib-containers-storage-overlay-6f4e251abeed207ca6218b60c1f736e66593d371ba91812056310d68085171f4-merged.mount: Deactivated successfully.
Dec  1 05:12:30 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:12:30 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:30 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:12:30.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:30 np0005540826 podman[230782]: 2025-12-01 10:12:30.927555208 +0000 UTC m=+0.079942212 container remove 29c46bbff73daede95d005d13d1e164d4a3762b27bf614a3b087b8f4dd9c08bd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, ceph=True, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 05:12:30 np0005540826 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.0.0.compute-1.osfnzc.service: Main process exited, code=exited, status=139/n/a
Dec  1 05:12:31 np0005540826 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.0.0.compute-1.osfnzc.service: Failed with result 'exit-code'.
Dec  1 05:12:31 np0005540826 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.0.0.compute-1.osfnzc.service: Consumed 1.347s CPU time.
Dec  1 05:12:31 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:12:32 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:12:32 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:12:32 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:12:32.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:12:32 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 05:12:32 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:12:32 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:12:32 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 05:12:32 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:12:32 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:32 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:12:32.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:34 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:12:34 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:34 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:12:34.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:34 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:12:34 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:34 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:12:34.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:35 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:12:35.667 141685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=2, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '36:10:da', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '4e:5c:35:98:90:37'}, ipsec=False) old=SB_Global(nb_cfg=1) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  1 05:12:35 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:12:35.667 141685 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  1 05:12:35 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:12:35.668 141685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=b99910e3-15ec-4cc7-b887-f5229f22d165, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '2'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  1 05:12:35 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis[85207]: [WARNING] 334/101235 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  1 05:12:36 np0005540826 podman[230911]: 2025-12-01 10:12:36.001024704 +0000 UTC m=+0.079606004 container health_status 6b06bbb7dde72e1297fe084ecc5611549b12415c8a2a7d771b9728c6924dbf55 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible)
Dec  1 05:12:36 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:12:36 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:12:36 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:36 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:12:36.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:36 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:12:36 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:36 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:12:36.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:38 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:12:38 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:12:38 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:12:38.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:12:38 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:12:38 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:38 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:12:38.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:39 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:12:39 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:12:40 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:12:40 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:40 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:12:40.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:40 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:12:40 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:40 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:12:40.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:41 np0005540826 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.0.0.compute-1.osfnzc.service: Scheduled restart job, restart counter is at 12.
Dec  1 05:12:41 np0005540826 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.osfnzc for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 05:12:41 np0005540826 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.0.0.compute-1.osfnzc.service: Consumed 1.347s CPU time.
Dec  1 05:12:41 np0005540826 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.osfnzc for 365f19c2-81e5-5edd-b6b4-280555214d3a...
Dec  1 05:12:41 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:12:41 np0005540826 podman[231012]: 2025-12-01 10:12:41.368945775 +0000 UTC m=+0.041853150 container create 906ed722a815a1cfd334e3bc1049e4163b4e83ae33db8401cbad245abea1eb1b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 05:12:41 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bc93c8ac0ee19ee476db17fe6021d26b9d6c74d00b3c03a9edac2b76545cb98f/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Dec  1 05:12:41 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bc93c8ac0ee19ee476db17fe6021d26b9d6c74d00b3c03a9edac2b76545cb98f/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 05:12:41 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bc93c8ac0ee19ee476db17fe6021d26b9d6c74d00b3c03a9edac2b76545cb98f/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 05:12:41 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bc93c8ac0ee19ee476db17fe6021d26b9d6c74d00b3c03a9edac2b76545cb98f/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.osfnzc-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 05:12:41 np0005540826 podman[231012]: 2025-12-01 10:12:41.429392215 +0000 UTC m=+0.102299610 container init 906ed722a815a1cfd334e3bc1049e4163b4e83ae33db8401cbad245abea1eb1b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, OSD_FLAVOR=default, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 05:12:41 np0005540826 podman[231012]: 2025-12-01 10:12:41.434325929 +0000 UTC m=+0.107233304 container start 906ed722a815a1cfd334e3bc1049e4163b4e83ae33db8401cbad245abea1eb1b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, ceph=True, CEPH_REF=squid)
Dec  1 05:12:41 np0005540826 bash[231012]: 906ed722a815a1cfd334e3bc1049e4163b4e83ae33db8401cbad245abea1eb1b
Dec  1 05:12:41 np0005540826 podman[231012]: 2025-12-01 10:12:41.35100367 +0000 UTC m=+0.023911065 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 05:12:41 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231028]: 01/12/2025 10:12:41 : epoch 692d6a19 : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Dec  1 05:12:41 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231028]: 01/12/2025 10:12:41 : epoch 692d6a19 : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Dec  1 05:12:41 np0005540826 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.osfnzc for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 05:12:41 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231028]: 01/12/2025 10:12:41 : epoch 692d6a19 : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Dec  1 05:12:41 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231028]: 01/12/2025 10:12:41 : epoch 692d6a19 : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Dec  1 05:12:41 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231028]: 01/12/2025 10:12:41 : epoch 692d6a19 : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Dec  1 05:12:41 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231028]: 01/12/2025 10:12:41 : epoch 692d6a19 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Dec  1 05:12:41 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231028]: 01/12/2025 10:12:41 : epoch 692d6a19 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Dec  1 05:12:41 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231028]: 01/12/2025 10:12:41 : epoch 692d6a19 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  1 05:12:42 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:12:42 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:42 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:12:42.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:42 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:12:42 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:42 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:12:42.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:43 np0005540826 podman[231096]: 2025-12-01 10:12:43.968047432 +0000 UTC m=+0.054302048 container health_status 2f6a34a2eb1139886d5df897b4264e2aa17883bd121a8757ac91426f6d44356f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec  1 05:12:44 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:12:44 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:44 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:12:44.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:44 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:12:44 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:44 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:12:44.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:45 np0005540826 nova_compute[229148]: 2025-12-01 10:12:45.109 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:12:46 np0005540826 nova_compute[229148]: 2025-12-01 10:12:46.110 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:12:46 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:12:46 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:12:46 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:46 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:12:46.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:46 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:12:46 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:46 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:12:46.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:47 np0005540826 nova_compute[229148]: 2025-12-01 10:12:47.109 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:12:47 np0005540826 nova_compute[229148]: 2025-12-01 10:12:47.110 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:12:47 np0005540826 nova_compute[229148]: 2025-12-01 10:12:47.110 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  1 05:12:47 np0005540826 nova_compute[229148]: 2025-12-01 10:12:47.110 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:12:47 np0005540826 nova_compute[229148]: 2025-12-01 10:12:47.131 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:12:47 np0005540826 nova_compute[229148]: 2025-12-01 10:12:47.132 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:12:47 np0005540826 nova_compute[229148]: 2025-12-01 10:12:47.132 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:12:47 np0005540826 nova_compute[229148]: 2025-12-01 10:12:47.132 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  1 05:12:47 np0005540826 nova_compute[229148]: 2025-12-01 10:12:47.132 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:12:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231028]: 01/12/2025 10:12:47 : epoch 692d6a19 : compute-1 : ganesha.nfsd-2[main] rados_kv_traverse :CLIENT ID :EVENT :Failed to lst kv ret=-2
Dec  1 05:12:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231028]: 01/12/2025 10:12:47 : epoch 692d6a19 : compute-1 : ganesha.nfsd-2[main] rados_cluster_read_clids :CLIENT ID :EVENT :Failed to traverse recovery db: -2
Dec  1 05:12:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231028]: 01/12/2025 10:12:47 : epoch 692d6a19 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  1 05:12:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231028]: 01/12/2025 10:12:47 : epoch 692d6a19 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 05:12:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231028]: 01/12/2025 10:12:47 : epoch 692d6a19 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  1 05:12:47 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:12:47 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1953192487' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:12:47 np0005540826 nova_compute[229148]: 2025-12-01 10:12:47.598 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.466s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:12:47 np0005540826 nova_compute[229148]: 2025-12-01 10:12:47.754 229152 WARNING nova.virt.libvirt.driver [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  1 05:12:47 np0005540826 nova_compute[229148]: 2025-12-01 10:12:47.755 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5302MB free_disk=59.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  1 05:12:47 np0005540826 nova_compute[229148]: 2025-12-01 10:12:47.756 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:12:47 np0005540826 nova_compute[229148]: 2025-12-01 10:12:47.756 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:12:47 np0005540826 nova_compute[229148]: 2025-12-01 10:12:47.826 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  1 05:12:47 np0005540826 nova_compute[229148]: 2025-12-01 10:12:47.827 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  1 05:12:47 np0005540826 nova_compute[229148]: 2025-12-01 10:12:47.846 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:12:48 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:12:48 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3555826676' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:12:48 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:12:48 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:48 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:12:48.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:48 np0005540826 nova_compute[229148]: 2025-12-01 10:12:48.294 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:12:48 np0005540826 nova_compute[229148]: 2025-12-01 10:12:48.299 229152 DEBUG nova.compute.provider_tree [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Inventory has not changed in ProviderTree for provider: 19014d04-db84-4f3d-831b-084720e9168c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  1 05:12:48 np0005540826 nova_compute[229148]: 2025-12-01 10:12:48.320 229152 DEBUG nova.scheduler.client.report [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Inventory has not changed for provider 19014d04-db84-4f3d-831b-084720e9168c based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  1 05:12:48 np0005540826 nova_compute[229148]: 2025-12-01 10:12:48.321 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  1 05:12:48 np0005540826 nova_compute[229148]: 2025-12-01 10:12:48.321 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.565s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:12:48 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:12:48 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:12:48 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:12:48.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:12:49 np0005540826 nova_compute[229148]: 2025-12-01 10:12:49.322 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:12:49 np0005540826 nova_compute[229148]: 2025-12-01 10:12:49.323 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:12:49 np0005540826 nova_compute[229148]: 2025-12-01 10:12:49.323 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  1 05:12:49 np0005540826 nova_compute[229148]: 2025-12-01 10:12:49.323 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  1 05:12:49 np0005540826 nova_compute[229148]: 2025-12-01 10:12:49.341 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  1 05:12:49 np0005540826 nova_compute[229148]: 2025-12-01 10:12:49.341 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:12:49 np0005540826 nova_compute[229148]: 2025-12-01 10:12:49.342 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:12:49 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231028]: 01/12/2025 10:12:49 : epoch 692d6a19 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  1 05:12:49 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231028]: 01/12/2025 10:12:49 : epoch 692d6a19 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  1 05:12:49 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231028]: 01/12/2025 10:12:49 : epoch 692d6a19 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 05:12:49 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231028]: 01/12/2025 10:12:49 : epoch 692d6a19 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  1 05:12:49 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231028]: 01/12/2025 10:12:49 : epoch 692d6a19 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  1 05:12:49 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231028]: 01/12/2025 10:12:49 : epoch 692d6a19 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  1 05:12:49 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231028]: 01/12/2025 10:12:49 : epoch 692d6a19 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 05:12:50 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:12:50 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:50 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:12:50.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:50 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:12:50 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:50 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:12:50.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:51 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:12:52 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:12:52 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:52 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:12:52.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:52 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis[85207]: [WARNING] 334/101252 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  1 05:12:52 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:12:52 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:52 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:12:52.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:54 np0005540826 ceph-mon[80026]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #40. Immutable memtables: 0.
Dec  1 05:12:54 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:12:54.179005) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  1 05:12:54 np0005540826 ceph-mon[80026]: rocksdb: [db/flush_job.cc:856] [default] [JOB 21] Flushing memtable with next log file: 40
Dec  1 05:12:54 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583974179037, "job": 21, "event": "flush_started", "num_memtables": 1, "num_entries": 2357, "num_deletes": 251, "total_data_size": 6456136, "memory_usage": 6541312, "flush_reason": "Manual Compaction"}
Dec  1 05:12:54 np0005540826 ceph-mon[80026]: rocksdb: [db/flush_job.cc:885] [default] [JOB 21] Level-0 flush table #41: started
Dec  1 05:12:54 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583974203983, "cf_name": "default", "job": 21, "event": "table_file_creation", "file_number": 41, "file_size": 4178025, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 20716, "largest_seqno": 23068, "table_properties": {"data_size": 4168505, "index_size": 6014, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2437, "raw_key_size": 19487, "raw_average_key_size": 20, "raw_value_size": 4149481, "raw_average_value_size": 4286, "num_data_blocks": 265, "num_entries": 968, "num_filter_entries": 968, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764583754, "oldest_key_time": 1764583754, "file_creation_time": 1764583974, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d6e1f97e-eb58-41c1-b758-cb672eabd75e", "db_session_id": "KSHNHU57VR1LHZ6EZUM4", "orig_file_number": 41, "seqno_to_time_mapping": "N/A"}}
Dec  1 05:12:54 np0005540826 ceph-mon[80026]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 21] Flush lasted 25070 microseconds, and 8210 cpu microseconds.
Dec  1 05:12:54 np0005540826 ceph-mon[80026]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 05:12:54 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:12:54.204053) [db/flush_job.cc:967] [default] [JOB 21] Level-0 flush table #41: 4178025 bytes OK
Dec  1 05:12:54 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:12:54.204097) [db/memtable_list.cc:519] [default] Level-0 commit table #41 started
Dec  1 05:12:54 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:12:54.205755) [db/memtable_list.cc:722] [default] Level-0 commit table #41: memtable #1 done
Dec  1 05:12:54 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:12:54.205778) EVENT_LOG_v1 {"time_micros": 1764583974205773, "job": 21, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  1 05:12:54 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:12:54.205802) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  1 05:12:54 np0005540826 ceph-mon[80026]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 21] Try to delete WAL files size 6445674, prev total WAL file size 6445674, number of live WAL files 2.
Dec  1 05:12:54 np0005540826 ceph-mon[80026]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000037.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:12:54 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:12:54.207652) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031353036' seq:72057594037927935, type:22 .. '7061786F730031373538' seq:0, type:0; will stop at (end)
Dec  1 05:12:54 np0005540826 ceph-mon[80026]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 22] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  1 05:12:54 np0005540826 ceph-mon[80026]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 21 Base level 0, inputs: [41(4080KB)], [39(12MB)]
Dec  1 05:12:54 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583974207712, "job": 22, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [41], "files_L6": [39], "score": -1, "input_data_size": 17789281, "oldest_snapshot_seqno": -1}
Dec  1 05:12:54 np0005540826 ceph-mon[80026]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 22] Generated table #42: 5485 keys, 15620273 bytes, temperature: kUnknown
Dec  1 05:12:54 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583974299784, "cf_name": "default", "job": 22, "event": "table_file_creation", "file_number": 42, "file_size": 15620273, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 15580878, "index_size": 24565, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 13765, "raw_key_size": 138178, "raw_average_key_size": 25, "raw_value_size": 15478695, "raw_average_value_size": 2822, "num_data_blocks": 1016, "num_entries": 5485, "num_filter_entries": 5485, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582559, "oldest_key_time": 0, "file_creation_time": 1764583974, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d6e1f97e-eb58-41c1-b758-cb672eabd75e", "db_session_id": "KSHNHU57VR1LHZ6EZUM4", "orig_file_number": 42, "seqno_to_time_mapping": "N/A"}}
Dec  1 05:12:54 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:12:54 np0005540826 ceph-mon[80026]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 05:12:54 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:54 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:12:54.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:54 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:12:54.300181) [db/compaction/compaction_job.cc:1663] [default] [JOB 22] Compacted 1@0 + 1@6 files to L6 => 15620273 bytes
Dec  1 05:12:54 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:12:54.301596) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 193.0 rd, 169.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(4.0, 13.0 +0.0 blob) out(14.9 +0.0 blob), read-write-amplify(8.0) write-amplify(3.7) OK, records in: 6001, records dropped: 516 output_compression: NoCompression
Dec  1 05:12:54 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:12:54.301632) EVENT_LOG_v1 {"time_micros": 1764583974301617, "job": 22, "event": "compaction_finished", "compaction_time_micros": 92178, "compaction_time_cpu_micros": 36429, "output_level": 6, "num_output_files": 1, "total_output_size": 15620273, "num_input_records": 6001, "num_output_records": 5485, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  1 05:12:54 np0005540826 ceph-mon[80026]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000041.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:12:54 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583974302694, "job": 22, "event": "table_file_deletion", "file_number": 41}
Dec  1 05:12:54 np0005540826 ceph-mon[80026]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000039.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:12:54 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583974304853, "job": 22, "event": "table_file_deletion", "file_number": 39}
Dec  1 05:12:54 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:12:54.207549) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:12:54 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:12:54.304967) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:12:54 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:12:54.304978) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:12:54 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:12:54.304983) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:12:54 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:12:54.304988) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:12:54 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:12:54.304990) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:12:54 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:12:54 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:54 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:12:54.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:55 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231028]: 01/12/2025 10:12:55 : epoch 692d6a19 : compute-1 : ganesha.nfsd-2[main] rados_cluster_end_grace :CLIENT ID :EVENT :Failed to remove rec-0000000000000022:nfs.cephfs.0: -2
Dec  1 05:12:55 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231028]: 01/12/2025 10:12:55 : epoch 692d6a19 : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  1 05:12:55 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231028]: 01/12/2025 10:12:55 : epoch 692d6a19 : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Dec  1 05:12:55 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231028]: 01/12/2025 10:12:55 : epoch 692d6a19 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Dec  1 05:12:55 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231028]: 01/12/2025 10:12:55 : epoch 692d6a19 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Dec  1 05:12:55 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231028]: 01/12/2025 10:12:55 : epoch 692d6a19 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Dec  1 05:12:55 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231028]: 01/12/2025 10:12:55 : epoch 692d6a19 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Dec  1 05:12:55 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231028]: 01/12/2025 10:12:55 : epoch 692d6a19 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Dec  1 05:12:55 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231028]: 01/12/2025 10:12:55 : epoch 692d6a19 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:12:55 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231028]: 01/12/2025 10:12:55 : epoch 692d6a19 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:12:55 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231028]: 01/12/2025 10:12:55 : epoch 692d6a19 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:12:55 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231028]: 01/12/2025 10:12:55 : epoch 692d6a19 : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Dec  1 05:12:55 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231028]: 01/12/2025 10:12:55 : epoch 692d6a19 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:12:55 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231028]: 01/12/2025 10:12:55 : epoch 692d6a19 : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Dec  1 05:12:55 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231028]: 01/12/2025 10:12:55 : epoch 692d6a19 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Dec  1 05:12:55 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231028]: 01/12/2025 10:12:55 : epoch 692d6a19 : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Dec  1 05:12:55 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231028]: 01/12/2025 10:12:55 : epoch 692d6a19 : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Dec  1 05:12:55 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231028]: 01/12/2025 10:12:55 : epoch 692d6a19 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Dec  1 05:12:55 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231028]: 01/12/2025 10:12:55 : epoch 692d6a19 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Dec  1 05:12:55 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231028]: 01/12/2025 10:12:55 : epoch 692d6a19 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Dec  1 05:12:55 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231028]: 01/12/2025 10:12:55 : epoch 692d6a19 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Dec  1 05:12:55 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231028]: 01/12/2025 10:12:55 : epoch 692d6a19 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Dec  1 05:12:55 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231028]: 01/12/2025 10:12:55 : epoch 692d6a19 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Dec  1 05:12:55 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231028]: 01/12/2025 10:12:55 : epoch 692d6a19 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Dec  1 05:12:55 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231028]: 01/12/2025 10:12:55 : epoch 692d6a19 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Dec  1 05:12:55 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231028]: 01/12/2025 10:12:55 : epoch 692d6a19 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  1 05:12:55 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231028]: 01/12/2025 10:12:55 : epoch 692d6a19 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Dec  1 05:12:55 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231028]: 01/12/2025 10:12:55 : epoch 692d6a19 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  1 05:12:55 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231028]: 01/12/2025 10:12:55 : epoch 692d6a19 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f431c000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:55 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231028]: 01/12/2025 10:12:55 : epoch 692d6a19 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4304000da0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:56 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:12:56 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:12:56 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:56 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:12:56.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:56 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:12:56 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:56 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:12:56.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:57 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231028]: 01/12/2025 10:12:57 : epoch 692d6a19 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f42f0000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:57 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231028]: 01/12/2025 10:12:57 : epoch 692d6a19 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f42ec000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:57 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis[85207]: [WARNING] 334/101257 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  1 05:12:57 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231028]: 01/12/2025 10:12:57 : epoch 692d6a19 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f43080016e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:58 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:12:58 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:58 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:12:58.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:58 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:12:58 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:58 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:12:58.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:59 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231028]: 01/12/2025 10:12:59 : epoch 692d6a19 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f43080016e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:59 np0005540826 kernel: ganesha.nfsd[231171]: segfault at 50 ip 00007f43c66fd32e sp 00007f4387ffe210 error 4 in libntirpc.so.5.8[7f43c66e2000+2c000] likely on CPU 2 (core 0, socket 2)
Dec  1 05:12:59 np0005540826 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Dec  1 05:12:59 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231028]: 01/12/2025 10:12:59 : epoch 692d6a19 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f43080016e0 fd 38 proxy ignored for local
Dec  1 05:12:59 np0005540826 systemd[1]: Started Process Core Dump (PID 231183/UID 0).
Dec  1 05:12:59 np0005540826 podman[231185]: 2025-12-01 10:12:59.990378846 +0000 UTC m=+0.070444512 container health_status 9ce59c2758d021c154e97f8a3178b73d8f43045c59b9ba6fd93369a760a49136 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS)
Dec  1 05:13:00 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:13:00 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:00 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:13:00.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:00 np0005540826 systemd-coredump[231184]: Process 231032 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 45:#012#0  0x00007f43c66fd32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Dec  1 05:13:00 np0005540826 systemd[1]: systemd-coredump@12-231183-0.service: Deactivated successfully.
Dec  1 05:13:00 np0005540826 systemd[1]: systemd-coredump@12-231183-0.service: Consumed 1.180s CPU time.
Dec  1 05:13:00 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:13:00 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:00 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:13:00.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:00 np0005540826 podman[231211]: 2025-12-01 10:13:00.978823128 +0000 UTC m=+0.029654168 container died 906ed722a815a1cfd334e3bc1049e4163b4e83ae33db8401cbad245abea1eb1b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default)
Dec  1 05:13:00 np0005540826 systemd[1]: var-lib-containers-storage-overlay-bc93c8ac0ee19ee476db17fe6021d26b9d6c74d00b3c03a9edac2b76545cb98f-merged.mount: Deactivated successfully.
Dec  1 05:13:01 np0005540826 podman[231211]: 2025-12-01 10:13:01.017735629 +0000 UTC m=+0.068566669 container remove 906ed722a815a1cfd334e3bc1049e4163b4e83ae33db8401cbad245abea1eb1b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True)
Dec  1 05:13:01 np0005540826 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.0.0.compute-1.osfnzc.service: Main process exited, code=exited, status=139/n/a
Dec  1 05:13:01 np0005540826 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.0.0.compute-1.osfnzc.service: Failed with result 'exit-code'.
Dec  1 05:13:01 np0005540826 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.0.0.compute-1.osfnzc.service: Consumed 1.316s CPU time.
Dec  1 05:13:01 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:13:02 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:13:02 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:02 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:13:02.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:02 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:13:02 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:13:02 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:13:02.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:13:04 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:13:04 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:04 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:13:04.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:13:04.544 141685 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:13:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:13:04.545 141685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:13:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:13:04.545 141685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:13:04 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:13:04 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:13:04 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:13:04.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:13:05 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis[85207]: [WARNING] 334/101305 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  1 05:13:06 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:13:06 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:13:06 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:06 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:13:06.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:06 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:13:06 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:13:06 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:13:06.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:13:07 np0005540826 podman[231282]: 2025-12-01 10:13:07.056900183 +0000 UTC m=+0.143443437 container health_status 6b06bbb7dde72e1297fe084ecc5611549b12415c8a2a7d771b9728c6924dbf55 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec  1 05:13:08 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:13:08 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:08 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:13:08.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:08 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:13:08 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:08 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:13:08.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:10 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:13:10 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:10 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:13:10.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:10 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:13:10 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:13:10 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:13:10.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:13:11 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:13:11 np0005540826 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.0.0.compute-1.osfnzc.service: Scheduled restart job, restart counter is at 13.
Dec  1 05:13:11 np0005540826 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.osfnzc for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 05:13:11 np0005540826 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.0.0.compute-1.osfnzc.service: Consumed 1.316s CPU time.
Dec  1 05:13:11 np0005540826 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.osfnzc for 365f19c2-81e5-5edd-b6b4-280555214d3a...
Dec  1 05:13:11 np0005540826 podman[231359]: 2025-12-01 10:13:11.615302791 +0000 UTC m=+0.040604547 container create 3fde69acbf8c5ca1571bb25125443a717902fa3adf1a0fda17aa4481bb855d19 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True)
Dec  1 05:13:11 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5b1fefe4355833a2d48e3d6f82957e49066c00b54f2faa94327822593b8084b8/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Dec  1 05:13:11 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5b1fefe4355833a2d48e3d6f82957e49066c00b54f2faa94327822593b8084b8/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 05:13:11 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5b1fefe4355833a2d48e3d6f82957e49066c00b54f2faa94327822593b8084b8/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 05:13:11 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5b1fefe4355833a2d48e3d6f82957e49066c00b54f2faa94327822593b8084b8/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.osfnzc-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 05:13:11 np0005540826 podman[231359]: 2025-12-01 10:13:11.670929391 +0000 UTC m=+0.096231177 container init 3fde69acbf8c5ca1571bb25125443a717902fa3adf1a0fda17aa4481bb855d19 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec  1 05:13:11 np0005540826 podman[231359]: 2025-12-01 10:13:11.675904168 +0000 UTC m=+0.101205934 container start 3fde69acbf8c5ca1571bb25125443a717902fa3adf1a0fda17aa4481bb855d19 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, ceph=True)
Dec  1 05:13:11 np0005540826 bash[231359]: 3fde69acbf8c5ca1571bb25125443a717902fa3adf1a0fda17aa4481bb855d19
Dec  1 05:13:11 np0005540826 podman[231359]: 2025-12-01 10:13:11.598016724 +0000 UTC m=+0.023318500 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 05:13:11 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:11 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Dec  1 05:13:11 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:11 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Dec  1 05:13:11 np0005540826 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.osfnzc for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 05:13:11 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:11 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Dec  1 05:13:11 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:11 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Dec  1 05:13:11 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:11 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Dec  1 05:13:11 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:11 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Dec  1 05:13:11 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:11 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Dec  1 05:13:11 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:11 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  1 05:13:12 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:13:12 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:12 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:13:12.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:12 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:13:12 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:12 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:13:12.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:14 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:13:14 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:14 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:13:14.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:14 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:13:14 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:14 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:13:14.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:15 np0005540826 podman[231418]: 2025-12-01 10:13:15.0071163 +0000 UTC m=+0.081971391 container health_status 2f6a34a2eb1139886d5df897b4264e2aa17883bd121a8757ac91426f6d44356f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Dec  1 05:13:16 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:13:16 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:13:16 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:16 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:13:16.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:16 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:13:16 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:13:16 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:13:16.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:13:17 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:17 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  1 05:13:17 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:17 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 05:13:18 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:13:18 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:18 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:13:18.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:18 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:13:18 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:18 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:13:18.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:20 np0005540826 ceph-mon[80026]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #43. Immutable memtables: 0.
Dec  1 05:13:20 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:13:20.282924) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  1 05:13:20 np0005540826 ceph-mon[80026]: rocksdb: [db/flush_job.cc:856] [default] [JOB 23] Flushing memtable with next log file: 43
Dec  1 05:13:20 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584000283015, "job": 23, "event": "flush_started", "num_memtables": 1, "num_entries": 479, "num_deletes": 252, "total_data_size": 711178, "memory_usage": 719416, "flush_reason": "Manual Compaction"}
Dec  1 05:13:20 np0005540826 ceph-mon[80026]: rocksdb: [db/flush_job.cc:885] [default] [JOB 23] Level-0 flush table #44: started
Dec  1 05:13:20 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584000287244, "cf_name": "default", "job": 23, "event": "table_file_creation", "file_number": 44, "file_size": 351224, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 23073, "largest_seqno": 23547, "table_properties": {"data_size": 348788, "index_size": 536, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 837, "raw_key_size": 6335, "raw_average_key_size": 19, "raw_value_size": 343884, "raw_average_value_size": 1058, "num_data_blocks": 24, "num_entries": 325, "num_filter_entries": 325, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764583975, "oldest_key_time": 1764583975, "file_creation_time": 1764584000, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d6e1f97e-eb58-41c1-b758-cb672eabd75e", "db_session_id": "KSHNHU57VR1LHZ6EZUM4", "orig_file_number": 44, "seqno_to_time_mapping": "N/A"}}
Dec  1 05:13:20 np0005540826 ceph-mon[80026]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 23] Flush lasted 4381 microseconds, and 2098 cpu microseconds.
Dec  1 05:13:20 np0005540826 ceph-mon[80026]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 05:13:20 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:13:20.287318) [db/flush_job.cc:967] [default] [JOB 23] Level-0 flush table #44: 351224 bytes OK
Dec  1 05:13:20 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:13:20.287342) [db/memtable_list.cc:519] [default] Level-0 commit table #44 started
Dec  1 05:13:20 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:13:20.288515) [db/memtable_list.cc:722] [default] Level-0 commit table #44: memtable #1 done
Dec  1 05:13:20 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:13:20.288538) EVENT_LOG_v1 {"time_micros": 1764584000288531, "job": 23, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  1 05:13:20 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:13:20.288563) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  1 05:13:20 np0005540826 ceph-mon[80026]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 23] Try to delete WAL files size 708269, prev total WAL file size 708269, number of live WAL files 2.
Dec  1 05:13:20 np0005540826 ceph-mon[80026]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000040.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:13:20 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:13:20.289149) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D67727374617400353031' seq:72057594037927935, type:22 .. '6D67727374617400373534' seq:0, type:0; will stop at (end)
Dec  1 05:13:20 np0005540826 ceph-mon[80026]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 24] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  1 05:13:20 np0005540826 ceph-mon[80026]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 23 Base level 0, inputs: [44(342KB)], [42(14MB)]
Dec  1 05:13:20 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584000289180, "job": 24, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [44], "files_L6": [42], "score": -1, "input_data_size": 15971497, "oldest_snapshot_seqno": -1}
Dec  1 05:13:20 np0005540826 ceph-mon[80026]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 24] Generated table #45: 5305 keys, 11953618 bytes, temperature: kUnknown
Dec  1 05:13:20 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584000350416, "cf_name": "default", "job": 24, "event": "table_file_creation", "file_number": 45, "file_size": 11953618, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11919760, "index_size": 19476, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 13317, "raw_key_size": 134850, "raw_average_key_size": 25, "raw_value_size": 11825062, "raw_average_value_size": 2229, "num_data_blocks": 793, "num_entries": 5305, "num_filter_entries": 5305, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582559, "oldest_key_time": 0, "file_creation_time": 1764584000, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d6e1f97e-eb58-41c1-b758-cb672eabd75e", "db_session_id": "KSHNHU57VR1LHZ6EZUM4", "orig_file_number": 45, "seqno_to_time_mapping": "N/A"}}
Dec  1 05:13:20 np0005540826 ceph-mon[80026]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 05:13:20 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:13:20.350914) [db/compaction/compaction_job.cc:1663] [default] [JOB 24] Compacted 1@0 + 1@6 files to L6 => 11953618 bytes
Dec  1 05:13:20 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:13:20.352562) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 260.4 rd, 194.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.3, 14.9 +0.0 blob) out(11.4 +0.0 blob), read-write-amplify(79.5) write-amplify(34.0) OK, records in: 5810, records dropped: 505 output_compression: NoCompression
Dec  1 05:13:20 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:13:20.352588) EVENT_LOG_v1 {"time_micros": 1764584000352575, "job": 24, "event": "compaction_finished", "compaction_time_micros": 61326, "compaction_time_cpu_micros": 27168, "output_level": 6, "num_output_files": 1, "total_output_size": 11953618, "num_input_records": 5810, "num_output_records": 5305, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  1 05:13:20 np0005540826 ceph-mon[80026]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000044.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:13:20 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584000352864, "job": 24, "event": "table_file_deletion", "file_number": 44}
Dec  1 05:13:20 np0005540826 ceph-mon[80026]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000042.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:13:20 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584000357345, "job": 24, "event": "table_file_deletion", "file_number": 42}
Dec  1 05:13:20 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:13:20.289051) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:13:20 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:13:20.357456) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:13:20 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:13:20.357465) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:13:20 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:13:20.357467) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:13:20 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:13:20.357469) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:13:20 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:13:20.357471) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:13:20 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:13:20 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:20 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:13:20.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:20 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:13:20 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:13:20 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:13:20.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:13:21 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:13:22 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:13:22 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:22 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:13:22.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:22 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:13:22 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:13:22 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:13:22.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:13:23 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:23 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  1 05:13:23 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:23 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Dec  1 05:13:23 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:23 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Dec  1 05:13:23 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:23 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Dec  1 05:13:23 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:23 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Dec  1 05:13:23 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:23 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Dec  1 05:13:23 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:23 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Dec  1 05:13:23 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:23 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:13:23 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:23 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:13:23 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:23 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:13:23 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:23 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Dec  1 05:13:23 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:23 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:13:23 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:23 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Dec  1 05:13:23 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:23 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Dec  1 05:13:23 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:23 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Dec  1 05:13:23 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:23 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Dec  1 05:13:23 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:23 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Dec  1 05:13:23 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:23 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Dec  1 05:13:23 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:23 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Dec  1 05:13:23 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:23 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Dec  1 05:13:23 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:23 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Dec  1 05:13:23 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:23 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Dec  1 05:13:23 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:23 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Dec  1 05:13:23 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:23 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Dec  1 05:13:23 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:23 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  1 05:13:23 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:23 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Dec  1 05:13:23 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:23 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  1 05:13:24 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:13:24 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:13:24 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:13:24.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:13:24 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:13:24 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:13:24 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:13:24.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:13:25 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:25 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9ce0000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:25 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:25 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cd8001c40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:25 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:25 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cbc000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:26 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:13:26 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:13:26 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:26 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:13:26.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:27 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:13:27 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:27 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:13:26.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:27 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:27 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cb4000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:27 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:27 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cb4000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:27 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis[85207]: [WARNING] 334/101327 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  1 05:13:27 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:27 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cd8001c40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:28 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:13:28 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:28 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:13:28.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:29 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:13:29 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:29 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:13:29.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:29 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:29 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cbc0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:29 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:29 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cc0001140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:29 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:29 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cd8001c40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:30 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:13:30 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:13:30 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:13:30.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:13:30 np0005540826 podman[231487]: 2025-12-01 10:13:30.984277136 +0000 UTC m=+0.067875749 container health_status 9ce59c2758d021c154e97f8a3178b73d8f43045c59b9ba6fd93369a760a49136 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  1 05:13:31 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:13:31 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:31 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:13:31.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:31 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:31 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cb4001b40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:31 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:13:31 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:31 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cbc0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:31 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:31 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cc0001c60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:32 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:13:32 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:32 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:13:32.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:33 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:13:33 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:33 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:13:33.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:33 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:33 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cd8002f20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:33 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:33 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cb4001b40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:33 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:33 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cbc0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:34 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:13:34 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:34 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:13:34.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:35 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:13:35 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:35 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:13:35.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:35 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:35 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cc0001c60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:35 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:35 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cd8002f20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:35 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:35 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cb4001b40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:36 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:13:36 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:13:36 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:36 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:13:36.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:37 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:13:37 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:37 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:13:37.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:37 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:37 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cb4001b40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:37 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:37 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cc0001c60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:37 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:37 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cd8002f20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:38 np0005540826 podman[231511]: 2025-12-01 10:13:38.028493794 +0000 UTC m=+0.110390240 container health_status 6b06bbb7dde72e1297fe084ecc5611549b12415c8a2a7d771b9728c6924dbf55 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Dec  1 05:13:38 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:13:38 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:38 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:13:38.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:39 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:13:39 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:39 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:13:39.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:39 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:39 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cbc002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:39 np0005540826 podman[231662]: 2025-12-01 10:13:39.220380439 +0000 UTC m=+0.074811392 container exec 7c618b1a07db29948a07bf15abaac0e58a16d635b553771e228aad7ce3c0be89 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-crash-compute-1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default)
Dec  1 05:13:39 np0005540826 podman[231662]: 2025-12-01 10:13:39.32279853 +0000 UTC m=+0.177229453 container exec_died 7c618b1a07db29948a07bf15abaac0e58a16d635b553771e228aad7ce3c0be89 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-crash-compute-1, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, ceph=True, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 05:13:39 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:39 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cb40032f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:39 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:39 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cc00030f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:39 np0005540826 podman[231780]: 2025-12-01 10:13:39.808013701 +0000 UTC m=+0.052239211 container exec b5cba524f9aa4c6a3dd34b5465b21a05089c422db1ad8a4bb4113778960afe0c (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec  1 05:13:39 np0005540826 podman[231780]: 2025-12-01 10:13:39.816337727 +0000 UTC m=+0.060563227 container exec_died b5cba524f9aa4c6a3dd34b5465b21a05089c422db1ad8a4bb4113778960afe0c (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec  1 05:13:40 np0005540826 podman[231869]: 2025-12-01 10:13:40.113819879 +0000 UTC m=+0.051721728 container exec 3fde69acbf8c5ca1571bb25125443a717902fa3adf1a0fda17aa4481bb855d19 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, ceph=True, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 05:13:40 np0005540826 podman[231869]: 2025-12-01 10:13:40.1231926 +0000 UTC m=+0.061094419 container exec_died 3fde69acbf8c5ca1571bb25125443a717902fa3adf1a0fda17aa4481bb855d19 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 05:13:40 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Dec  1 05:13:40 np0005540826 podman[231935]: 2025-12-01 10:13:40.321304993 +0000 UTC m=+0.043091375 container exec 5d28e05f02e2faaaecccbb224265ce968eb74994512ada8525a0c42a70019a9e (image=quay.io/ceph/haproxy:2.3, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis)
Dec  1 05:13:40 np0005540826 podman[231935]: 2025-12-01 10:13:40.35855718 +0000 UTC m=+0.080343532 container exec_died 5d28e05f02e2faaaecccbb224265ce968eb74994512ada8525a0c42a70019a9e (image=quay.io/ceph/haproxy:2.3, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis)
Dec  1 05:13:40 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:13:40 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:40 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:13:40.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:40 np0005540826 podman[232001]: 2025-12-01 10:13:40.59947119 +0000 UTC m=+0.055844886 container exec b99b1792fbad48bc30fedd676b925d405af73aba1041bee68b375eadfb2319cf (image=quay.io/ceph/keepalived:2.2.4, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-1-wzwqmm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.buildah.version=1.28.2, io.openshift.expose-services=, com.redhat.component=keepalived-container, version=2.2.4, summary=Provides keepalived on RHEL 9 for Ceph., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, release=1793, description=keepalived for Ceph, io.openshift.tags=Ceph keepalived, vcs-type=git, architecture=x86_64, build-date=2023-02-22T09:23:20, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Keepalived on RHEL 9, vendor=Red Hat, Inc.)
Dec  1 05:13:40 np0005540826 podman[232001]: 2025-12-01 10:13:40.614448982 +0000 UTC m=+0.070822658 container exec_died b99b1792fbad48bc30fedd676b925d405af73aba1041bee68b375eadfb2319cf (image=quay.io/ceph/keepalived:2.2.4, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-1-wzwqmm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=keepalived-container, io.buildah.version=1.28.2, description=keepalived for Ceph, io.openshift.tags=Ceph keepalived, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=keepalived, distribution-scope=public, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.expose-services=, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, build-date=2023-02-22T09:23:20, version=2.2.4, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1793, vendor=Red Hat, Inc., summary=Provides keepalived on RHEL 9 for Ceph.)
Dec  1 05:13:41 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:13:41 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:13:41 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:13:41.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:13:41 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:41 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cd8002f20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:41 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:13:41 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:13:41 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:13:41 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:13:41 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:13:41 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:41 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cbc002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:41 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis[85207]: [WARNING] 334/101341 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  1 05:13:41 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:41 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cb40032f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:42 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Dec  1 05:13:42 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Dec  1 05:13:42 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 05:13:42 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:13:42 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:13:42 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 05:13:42 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:13:42 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:42 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:13:42.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:43 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:13:43 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:43 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:13:43.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:43 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:43 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cc00030f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:43 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:43 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cd8002f20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:43 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:43 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cbc003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:44 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:13:44 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:44 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:13:44.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:45 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:13:45 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:13:45 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:13:45.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:13:45 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:45 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cb4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:45 np0005540826 nova_compute[229148]: 2025-12-01 10:13:45.110 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:13:45 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:45 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cc00030f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:45 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:45 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cc00030f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:45 np0005540826 podman[232140]: 2025-12-01 10:13:45.98215887 +0000 UTC m=+0.058240452 container health_status 2f6a34a2eb1139886d5df897b4264e2aa17883bd121a8757ac91426f6d44356f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent)
Dec  1 05:13:46 np0005540826 nova_compute[229148]: 2025-12-01 10:13:46.110 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:13:46 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:13:46 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:13:46 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:13:46 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:13:46.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:13:47 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:13:47 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:47 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:13:47.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:47 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cbc003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:47 np0005540826 nova_compute[229148]: 2025-12-01 10:13:47.105 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:13:47 np0005540826 nova_compute[229148]: 2025-12-01 10:13:47.306 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:13:47 np0005540826 nova_compute[229148]: 2025-12-01 10:13:47.307 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  1 05:13:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:47 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cb4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:47 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cd8002f20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:48 np0005540826 nova_compute[229148]: 2025-12-01 10:13:48.109 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:13:48 np0005540826 nova_compute[229148]: 2025-12-01 10:13:48.110 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  1 05:13:48 np0005540826 nova_compute[229148]: 2025-12-01 10:13:48.110 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  1 05:13:48 np0005540826 nova_compute[229148]: 2025-12-01 10:13:48.123 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  1 05:13:48 np0005540826 nova_compute[229148]: 2025-12-01 10:13:48.124 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:13:48 np0005540826 nova_compute[229148]: 2025-12-01 10:13:48.145 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:13:48 np0005540826 nova_compute[229148]: 2025-12-01 10:13:48.146 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:13:48 np0005540826 nova_compute[229148]: 2025-12-01 10:13:48.146 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:13:48 np0005540826 nova_compute[229148]: 2025-12-01 10:13:48.147 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  1 05:13:48 np0005540826 nova_compute[229148]: 2025-12-01 10:13:48.147 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:13:48 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:13:48 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:48 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:13:48.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:48 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:13:48 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1748997009' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:13:48 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:13:48 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:13:48 np0005540826 nova_compute[229148]: 2025-12-01 10:13:48.607 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:13:48 np0005540826 nova_compute[229148]: 2025-12-01 10:13:48.757 229152 WARNING nova.virt.libvirt.driver [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  1 05:13:48 np0005540826 nova_compute[229148]: 2025-12-01 10:13:48.759 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5181MB free_disk=59.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  1 05:13:48 np0005540826 nova_compute[229148]: 2025-12-01 10:13:48.760 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:13:48 np0005540826 nova_compute[229148]: 2025-12-01 10:13:48.760 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:13:48 np0005540826 nova_compute[229148]: 2025-12-01 10:13:48.891 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  1 05:13:48 np0005540826 nova_compute[229148]: 2025-12-01 10:13:48.892 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  1 05:13:48 np0005540826 nova_compute[229148]: 2025-12-01 10:13:48.908 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:13:49 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:13:49 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:13:49 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:13:49.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:13:49 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:49 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cc00041f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:49 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:13:49 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3701083602' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:13:49 np0005540826 nova_compute[229148]: 2025-12-01 10:13:49.348 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.441s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:13:49 np0005540826 nova_compute[229148]: 2025-12-01 10:13:49.354 229152 DEBUG nova.compute.provider_tree [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Inventory has not changed in ProviderTree for provider: 19014d04-db84-4f3d-831b-084720e9168c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  1 05:13:49 np0005540826 nova_compute[229148]: 2025-12-01 10:13:49.374 229152 DEBUG nova.scheduler.client.report [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Inventory has not changed for provider 19014d04-db84-4f3d-831b-084720e9168c based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  1 05:13:49 np0005540826 nova_compute[229148]: 2025-12-01 10:13:49.376 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  1 05:13:49 np0005540826 nova_compute[229148]: 2025-12-01 10:13:49.376 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.616s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:13:49 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:49 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cc00041f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:49 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:49 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cb4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:49 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:49 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  1 05:13:50 np0005540826 nova_compute[229148]: 2025-12-01 10:13:50.362 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:13:50 np0005540826 nova_compute[229148]: 2025-12-01 10:13:50.362 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:13:50 np0005540826 nova_compute[229148]: 2025-12-01 10:13:50.362 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:13:50 np0005540826 nova_compute[229148]: 2025-12-01 10:13:50.362 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:13:50 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:13:50 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:50 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:13:50.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:51 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:13:51 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:51 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:13:51.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:51 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:51 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cd8002f20 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:51 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:13:51 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:51 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cbc003820 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:51 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:51 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cc00041f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:52 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:13:52 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:52 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:13:52.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:52 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:52 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  1 05:13:52 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:52 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 05:13:53 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:13:53 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:53 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:13:53.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:53 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:53 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cb4003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:53 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:53 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cd8002f20 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:53 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:53 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cd8002f20 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:54 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:13:54 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:54 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:13:54.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:55 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:13:55 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:55 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:13:55.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:55 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:55 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cc00041f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:55 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:55 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cb4003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:55 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:55 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cbc003820 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:55 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:55 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  1 05:13:56 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:13:56 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:13:56 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:56 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:13:56.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:57 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:13:57 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:57 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:13:57.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:57 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:57 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cd8002f20 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:57 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:57 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cc00041f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:57 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:57 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cb0000b60 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:58 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:13:58 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:58 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:13:58.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:59 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:13:59 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:13:59 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:13:59.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:13:59 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:59 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cb4003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:59 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:59 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cc00041f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:59 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:13:59 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9ce0000df0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:00 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:14:00 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:14:00 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:14:00.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:14:01 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:14:01 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:14:01 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:14:01.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:14:01 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:01 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cb00016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:01 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:14:01 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:01 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cb4003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:01 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis[85207]: [WARNING] 334/101401 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  1 05:14:01 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:01 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cc00041f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:02 np0005540826 podman[232238]: 2025-12-01 10:14:02.006182308 +0000 UTC m=+0.089764853 container health_status 9ce59c2758d021c154e97f8a3178b73d8f43045c59b9ba6fd93369a760a49136 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team)
Dec  1 05:14:02 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:14:02 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:14:02 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:14:02.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:14:03 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:14:03 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:14:03 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:14:03.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:14:03 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:03 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9ce0000df0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:03 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:03 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cb00016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:03 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:03 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cb4003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:04 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:14:04 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:14:04 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:14:04.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:14:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:14:04.545 141685 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:14:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:14:04.546 141685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:14:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:14:04.546 141685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:14:05 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:14:05 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:14:05 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:14:05.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:14:05 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:05 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cc00041f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:05 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:05 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9ce0000df0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:05 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:05 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cb4003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:06 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:14:06 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:14:06 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:14:06 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:14:06.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:14:07 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:14:07 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:14:07 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:14:07.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:14:07 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:07 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cb00016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:07 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:07 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cc00041f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:07 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:07 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9ce0008dc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:08 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:14:08 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:14:08 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:14:08.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:14:09 np0005540826 podman[232287]: 2025-12-01 10:14:09.010959651 +0000 UTC m=+0.088672896 container health_status 6b06bbb7dde72e1297fe084ecc5611549b12415c8a2a7d771b9728c6924dbf55 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible)
Dec  1 05:14:09 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:14:09 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:14:09 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:14:09.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:14:09 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:09 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cb4003db0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:09 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:09 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cb0002b10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:09 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:09 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cc00041f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:10 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:14:10 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:14:10 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:14:10.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:14:10 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis[85207]: [WARNING] 334/101410 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  1 05:14:11 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:14:11 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:14:11 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:14:11.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:14:11 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:11 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9ce0008dc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:11 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:14:11 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:11 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cb4003dd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:11 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:11 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cb4003dd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:12 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:14:12 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:14:12 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:14:12.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:14:13 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:14:13 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:14:13 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:14:13.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:14:13 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:13 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cb4003dd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:13 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:13 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9ce0009ad0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:13 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:13 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cb0002b10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:14 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:14:14 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:14:14 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:14:14.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:14:15 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:14:15 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:14:15 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:14:15.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:14:15 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:15 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cc00041f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:15 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:15 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cb4003dd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:15 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:15 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9ce0009ad0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:15 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:14:15 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:14:15 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - - [01/Dec/2025:10:14:15.850 +0000] "GET /swift/info HTTP/1.1" 200 539 - "python-urllib3/1.26.5" - latency=0.000000000s
Dec  1 05:14:16 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:14:16 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:14:16 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:14:16 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:14:16.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:14:16 np0005540826 podman[232318]: 2025-12-01 10:14:16.967315403 +0000 UTC m=+0.055060927 container health_status 2f6a34a2eb1139886d5df897b4264e2aa17883bd121a8757ac91426f6d44356f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=ovn_metadata_agent)
Dec  1 05:14:17 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:14:17 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:14:17 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:14:17.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:14:17 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:17 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cb0002b10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:17 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:17 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cc00041f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:17 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:17 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cb4003dd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:18 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:14:18 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:14:18 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:14:18.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:14:19 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:14:19 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:14:19 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:14:19.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:14:19 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:19 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9ce000a7e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:19 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:19 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cb0003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:19 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:19 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cc00041f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:20 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e137 e137: 3 total, 3 up, 3 in
Dec  1 05:14:20 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:14:20 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:14:20 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:14:20.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:14:20 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:20 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  1 05:14:21 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:14:21 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:14:21 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:14:21.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:14:21 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:21 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cb4003dd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:21 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:14:21 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e138 e138: 3 total, 3 up, 3 in
Dec  1 05:14:21 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:21 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9ce000a7e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:21 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:21 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cb0003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:22 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:14:22 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:14:22 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:14:22.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:14:22 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e139 e139: 3 total, 3 up, 3 in
Dec  1 05:14:23 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:14:23 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:14:23 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:14:23.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:14:23 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:23 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cc00041f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:23 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:23 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  1 05:14:23 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:23 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 05:14:23 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:23 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cb4003dd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:23 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:23 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9ce000a7e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:24 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:14:24 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:14:24 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:14:24.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:14:24 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e140 e140: 3 total, 3 up, 3 in
Dec  1 05:14:25 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:14:25 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:14:25 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:14:25.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:14:25 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:25 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cb0003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:25 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:25 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cc00041f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:25 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis[85207]: [WARNING] 334/101425 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  1 05:14:25 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:25 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cb4003dd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:26 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e140 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:14:26 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:14:26 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:14:26 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:14:26.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:14:27 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:14:27 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:14:27 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:14:27.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:14:27 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:27 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9ce000a7e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:27 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:27 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cb0003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:27 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:27 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cbc001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:28 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:14:28 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:14:28 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:14:28.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:14:29 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:14:29 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:14:29 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:14:29.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:14:29 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:29 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cb4003fe0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:29 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:29 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9ce000a7e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:29 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:29 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cd80013a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:30 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e141 e141: 3 total, 3 up, 3 in
Dec  1 05:14:30 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:14:30 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:14:30 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:14:30.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:14:31 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:14:31 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:14:31 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:14:31.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:14:31 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:31 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cbc001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:31 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e141 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:14:31 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:31 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cb4004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:31 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:31 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9ce000a7e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:32 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:14:32 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:14:32 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:14:32.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:14:32 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:32 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 05:14:32 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:32 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 05:14:32 np0005540826 podman[232372]: 2025-12-01 10:14:32.984522668 +0000 UTC m=+0.071238787 container health_status 9ce59c2758d021c154e97f8a3178b73d8f43045c59b9ba6fd93369a760a49136 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  1 05:14:33 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:14:33 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:14:33 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:14:33.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:14:33 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:33 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cd80013a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:33 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:33 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cbc001bd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:33 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:33 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cb4004020 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:34 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:14:34 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:14:34 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:14:34.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:14:35 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:14:35 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:14:35 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:14:35.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:14:35 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:35 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9ce000a7e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:35 np0005540826 ceph-mon[80026]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #46. Immutable memtables: 0.
Dec  1 05:14:35 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:14:35.322465) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  1 05:14:35 np0005540826 ceph-mon[80026]: rocksdb: [db/flush_job.cc:856] [default] [JOB 25] Flushing memtable with next log file: 46
Dec  1 05:14:35 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584075322510, "job": 25, "event": "flush_started", "num_memtables": 1, "num_entries": 1062, "num_deletes": 256, "total_data_size": 2441875, "memory_usage": 2467152, "flush_reason": "Manual Compaction"}
Dec  1 05:14:35 np0005540826 ceph-mon[80026]: rocksdb: [db/flush_job.cc:885] [default] [JOB 25] Level-0 flush table #47: started
Dec  1 05:14:35 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584075336380, "cf_name": "default", "job": 25, "event": "table_file_creation", "file_number": 47, "file_size": 1603094, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 23552, "largest_seqno": 24609, "table_properties": {"data_size": 1598144, "index_size": 2474, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1413, "raw_key_size": 10461, "raw_average_key_size": 19, "raw_value_size": 1588131, "raw_average_value_size": 2914, "num_data_blocks": 108, "num_entries": 545, "num_filter_entries": 545, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764584001, "oldest_key_time": 1764584001, "file_creation_time": 1764584075, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d6e1f97e-eb58-41c1-b758-cb672eabd75e", "db_session_id": "KSHNHU57VR1LHZ6EZUM4", "orig_file_number": 47, "seqno_to_time_mapping": "N/A"}}
Dec  1 05:14:35 np0005540826 ceph-mon[80026]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 25] Flush lasted 14007 microseconds, and 4630 cpu microseconds.
Dec  1 05:14:35 np0005540826 ceph-mon[80026]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 05:14:35 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:14:35.336474) [db/flush_job.cc:967] [default] [JOB 25] Level-0 flush table #47: 1603094 bytes OK
Dec  1 05:14:35 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:14:35.336497) [db/memtable_list.cc:519] [default] Level-0 commit table #47 started
Dec  1 05:14:35 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:14:35.338029) [db/memtable_list.cc:722] [default] Level-0 commit table #47: memtable #1 done
Dec  1 05:14:35 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:14:35.338047) EVENT_LOG_v1 {"time_micros": 1764584075338042, "job": 25, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  1 05:14:35 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:14:35.338085) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  1 05:14:35 np0005540826 ceph-mon[80026]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 25] Try to delete WAL files size 2436597, prev total WAL file size 2436597, number of live WAL files 2.
Dec  1 05:14:35 np0005540826 ceph-mon[80026]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000043.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:14:35 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:14:35.338898) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D00323532' seq:72057594037927935, type:22 .. '6C6F676D00353034' seq:0, type:0; will stop at (end)
Dec  1 05:14:35 np0005540826 ceph-mon[80026]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 26] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  1 05:14:35 np0005540826 ceph-mon[80026]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 25 Base level 0, inputs: [47(1565KB)], [45(11MB)]
Dec  1 05:14:35 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584075338964, "job": 26, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [47], "files_L6": [45], "score": -1, "input_data_size": 13556712, "oldest_snapshot_seqno": -1}
Dec  1 05:14:35 np0005540826 ceph-mon[80026]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 26] Generated table #48: 5316 keys, 13364990 bytes, temperature: kUnknown
Dec  1 05:14:35 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584075419682, "cf_name": "default", "job": 26, "event": "table_file_creation", "file_number": 48, "file_size": 13364990, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13329360, "index_size": 21248, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 13317, "raw_key_size": 136270, "raw_average_key_size": 25, "raw_value_size": 13232742, "raw_average_value_size": 2489, "num_data_blocks": 865, "num_entries": 5316, "num_filter_entries": 5316, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582559, "oldest_key_time": 0, "file_creation_time": 1764584075, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d6e1f97e-eb58-41c1-b758-cb672eabd75e", "db_session_id": "KSHNHU57VR1LHZ6EZUM4", "orig_file_number": 48, "seqno_to_time_mapping": "N/A"}}
Dec  1 05:14:35 np0005540826 ceph-mon[80026]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 05:14:35 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:14:35.419964) [db/compaction/compaction_job.cc:1663] [default] [JOB 26] Compacted 1@0 + 1@6 files to L6 => 13364990 bytes
Dec  1 05:14:35 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:14:35.421366) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 167.8 rd, 165.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.5, 11.4 +0.0 blob) out(12.7 +0.0 blob), read-write-amplify(16.8) write-amplify(8.3) OK, records in: 5850, records dropped: 534 output_compression: NoCompression
Dec  1 05:14:35 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:14:35.421385) EVENT_LOG_v1 {"time_micros": 1764584075421376, "job": 26, "event": "compaction_finished", "compaction_time_micros": 80801, "compaction_time_cpu_micros": 28220, "output_level": 6, "num_output_files": 1, "total_output_size": 13364990, "num_input_records": 5850, "num_output_records": 5316, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  1 05:14:35 np0005540826 ceph-mon[80026]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000047.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:14:35 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584075421821, "job": 26, "event": "table_file_deletion", "file_number": 47}
Dec  1 05:14:35 np0005540826 ceph-mon[80026]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000045.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:14:35 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584075424373, "job": 26, "event": "table_file_deletion", "file_number": 45}
Dec  1 05:14:35 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:14:35.338783) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:14:35 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:14:35.424459) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:14:35 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:14:35.424468) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:14:35 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:14:35.424469) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:14:35 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:14:35.424470) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:14:35 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:14:35.424472) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:14:35 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:35 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cd80026e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:35 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:35 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cbc001bd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:35 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:35 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  1 05:14:36 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e141 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:14:36 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:14:36 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:14:36 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:14:36.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:14:37 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:14:37 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:14:37 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:14:37.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:14:37 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:37 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cb4004040 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:37 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:37 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9ce000a7e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:37 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:37 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cd80026e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:38 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:14:38 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:14:38 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:14:38.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:14:39 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:39 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  1 05:14:39 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:14:39 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:14:39 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:14:39.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:14:39 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:39 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cbc0032b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:39 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:39 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cb4004060 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:39 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:39 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9ce000a7e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:40 np0005540826 podman[232395]: 2025-12-01 10:14:40.079503812 +0000 UTC m=+0.154457301 container health_status 6b06bbb7dde72e1297fe084ecc5611549b12415c8a2a7d771b9728c6924dbf55 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec  1 05:14:40 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:14:40 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:14:40 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:14:40.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:14:41 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:14:41 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:14:41 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:14:41.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:14:41 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:41 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cd80033f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:41 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e141 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:14:41 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:41 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cbc0032b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:41 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:41 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cb4004080 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:42 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:42 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  1 05:14:42 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:42 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 05:14:42 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:14:42 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:14:42 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:14:42.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:14:42 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:14:42.728 141685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=3, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '36:10:da', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '4e:5c:35:98:90:37'}, ipsec=False) old=SB_Global(nb_cfg=2) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  1 05:14:42 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:14:42.728 141685 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  1 05:14:42 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis[85207]: [WARNING] 334/101442 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 1ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  1 05:14:43 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:14:43 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:14:43 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:14:43.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:14:43 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:43 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9ce000a7e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:43 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:43 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cd80033f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:43 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:14:43.731 141685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=b99910e3-15ec-4cc7-b887-f5229f22d165, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '3'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  1 05:14:43 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:43 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cbc0032b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:44 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:14:44 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:14:44 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:14:44.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:14:45 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:45 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  1 05:14:45 np0005540826 nova_compute[229148]: 2025-12-01 10:14:45.110 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:14:45 np0005540826 nova_compute[229148]: 2025-12-01 10:14:45.110 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:14:45 np0005540826 nova_compute[229148]: 2025-12-01 10:14:45.111 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Dec  1 05:14:45 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:14:45 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:14:45 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:14:45.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:14:45 np0005540826 nova_compute[229148]: 2025-12-01 10:14:45.128 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Dec  1 05:14:45 np0005540826 nova_compute[229148]: 2025-12-01 10:14:45.129 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:14:45 np0005540826 nova_compute[229148]: 2025-12-01 10:14:45.129 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Dec  1 05:14:45 np0005540826 nova_compute[229148]: 2025-12-01 10:14:45.139 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:14:45 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:45 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cb40040a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:45 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:45 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9ce000a7e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:45 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:45 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cd8004100 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:46 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e141 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:14:46 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:14:46 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:14:46 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:14:46.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:14:47 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:14:47 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:14:47 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:14:47.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:14:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:47 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cbc0032b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:47 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cb40040c0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:47 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9ce000a7e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis[85207]: [WARNING] 334/101447 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 1ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  1 05:14:47 np0005540826 podman[232450]: 2025-12-01 10:14:47.981277189 +0000 UTC m=+0.062908406 container health_status 2f6a34a2eb1139886d5df897b4264e2aa17883bd121a8757ac91426f6d44356f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Dec  1 05:14:48 np0005540826 nova_compute[229148]: 2025-12-01 10:14:48.150 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:14:48 np0005540826 nova_compute[229148]: 2025-12-01 10:14:48.150 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:14:48 np0005540826 nova_compute[229148]: 2025-12-01 10:14:48.151 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  1 05:14:48 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:14:48 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:14:48 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:14:48.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:14:48 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 05:14:48 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:14:48 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:14:49 np0005540826 nova_compute[229148]: 2025-12-01 10:14:49.109 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:14:49 np0005540826 nova_compute[229148]: 2025-12-01 10:14:49.110 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  1 05:14:49 np0005540826 nova_compute[229148]: 2025-12-01 10:14:49.110 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  1 05:14:49 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:14:49 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:14:49 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:14:49.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:14:49 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:49 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cd8004100 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:49 np0005540826 nova_compute[229148]: 2025-12-01 10:14:49.174 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  1 05:14:49 np0005540826 nova_compute[229148]: 2025-12-01 10:14:49.175 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:14:49 np0005540826 nova_compute[229148]: 2025-12-01 10:14:49.175 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:14:49 np0005540826 nova_compute[229148]: 2025-12-01 10:14:49.200 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:14:49 np0005540826 nova_compute[229148]: 2025-12-01 10:14:49.201 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:14:49 np0005540826 nova_compute[229148]: 2025-12-01 10:14:49.201 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:14:49 np0005540826 nova_compute[229148]: 2025-12-01 10:14:49.201 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  1 05:14:49 np0005540826 nova_compute[229148]: 2025-12-01 10:14:49.201 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:14:49 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:14:49 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1459856026' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:14:49 np0005540826 nova_compute[229148]: 2025-12-01 10:14:49.663 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.461s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:14:49 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:49 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cbc0032b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:49 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 05:14:49 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:49 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cb40040e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:49 np0005540826 nova_compute[229148]: 2025-12-01 10:14:49.858 229152 WARNING nova.virt.libvirt.driver [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  1 05:14:49 np0005540826 nova_compute[229148]: 2025-12-01 10:14:49.860 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5255MB free_disk=59.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  1 05:14:49 np0005540826 nova_compute[229148]: 2025-12-01 10:14:49.860 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:14:49 np0005540826 nova_compute[229148]: 2025-12-01 10:14:49.860 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:14:50 np0005540826 nova_compute[229148]: 2025-12-01 10:14:50.083 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  1 05:14:50 np0005540826 nova_compute[229148]: 2025-12-01 10:14:50.083 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  1 05:14:50 np0005540826 nova_compute[229148]: 2025-12-01 10:14:50.146 229152 DEBUG nova.scheduler.client.report [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Refreshing inventories for resource provider 19014d04-db84-4f3d-831b-084720e9168c _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Dec  1 05:14:50 np0005540826 nova_compute[229148]: 2025-12-01 10:14:50.164 229152 DEBUG nova.scheduler.client.report [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Updating ProviderTree inventory for provider 19014d04-db84-4f3d-831b-084720e9168c from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Dec  1 05:14:50 np0005540826 nova_compute[229148]: 2025-12-01 10:14:50.165 229152 DEBUG nova.compute.provider_tree [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Updating inventory in ProviderTree for provider 19014d04-db84-4f3d-831b-084720e9168c with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Dec  1 05:14:50 np0005540826 nova_compute[229148]: 2025-12-01 10:14:50.210 229152 DEBUG nova.scheduler.client.report [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Refreshing aggregate associations for resource provider 19014d04-db84-4f3d-831b-084720e9168c, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Dec  1 05:14:50 np0005540826 nova_compute[229148]: 2025-12-01 10:14:50.234 229152 DEBUG nova.scheduler.client.report [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Refreshing trait associations for resource provider 19014d04-db84-4f3d-831b-084720e9168c, traits: COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SHA,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE2,HW_CPU_X86_ABM,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_FDC,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE4A,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE,HW_CPU_X86_AMD_SVM,HW_CPU_X86_MMX,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_AESNI,HW_CPU_X86_AVX2,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSE42,COMPUTE_RESCUE_BFV,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_AVX,HW_CPU_X86_F16C,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_CLMUL,HW_CPU_X86_BMI,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_BMI2,HW_CPU_X86_FMA3,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SVM,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SOCKET_PCI_NUMA_AFFINITY _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Dec  1 05:14:50 np0005540826 nova_compute[229148]: 2025-12-01 10:14:50.250 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:14:50 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:14:50 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:14:50 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:14:50.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:14:50 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:14:50 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1908518850' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:14:50 np0005540826 nova_compute[229148]: 2025-12-01 10:14:50.708 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  1 05:14:50 np0005540826 nova_compute[229148]: 2025-12-01 10:14:50.714 229152 DEBUG nova.compute.provider_tree [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Inventory has not changed in ProviderTree for provider: 19014d04-db84-4f3d-831b-084720e9168c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec  1 05:14:50 np0005540826 nova_compute[229148]: 2025-12-01 10:14:50.737 229152 DEBUG nova.scheduler.client.report [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Inventory has not changed for provider 19014d04-db84-4f3d-831b-084720e9168c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec  1 05:14:50 np0005540826 nova_compute[229148]: 2025-12-01 10:14:50.738 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec  1 05:14:50 np0005540826 nova_compute[229148]: 2025-12-01 10:14:50.739 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.878s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  1 05:14:51 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:14:51 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:14:51 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:14:51.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:14:51 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:51 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9ce000a7e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:51 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e141 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:14:51 np0005540826 nova_compute[229148]: 2025-12-01 10:14:51.674 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  1 05:14:51 np0005540826 nova_compute[229148]: 2025-12-01 10:14:51.675 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  1 05:14:51 np0005540826 nova_compute[229148]: 2025-12-01 10:14:51.675 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  1 05:14:51 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:51 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cd8004100 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:51 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:51 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cbc0032b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:52 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:14:52 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:14:52 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:14:52.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:14:53 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:14:53 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:14:53 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:14:53.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:14:53 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:53 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cb4004100 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:53 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:53 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9ce000a7e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:53 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:53 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cd8004100 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:54 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:14:54 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:14:54 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:14:54.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:14:55 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:14:55 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:14:55 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:14:55.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:14:55 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:55 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cbc0032b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:55 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:14:55 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:14:55 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:55 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cb4004120 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:55 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:55 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9ce000a7e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:55 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis[85207]: [WARNING] 334/101455 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  1 05:14:56 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e141 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:14:56 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:14:56 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:14:56 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:14:56.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:14:57 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:14:57 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:14:57 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:14:57.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:14:57 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:57 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cd8004100 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:57 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:57 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cbc0032b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:57 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:57 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cb4004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:58 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:14:58 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:14:58 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:14:58.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:14:59 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:14:59 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:14:59 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:14:59.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:14:59 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:59 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cd8004100 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:59 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:59 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9ce000a7e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:59 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:14:59 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cc0002c70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:00 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:15:00 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:15:00 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:15:00.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:15:01 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:15:01 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:15:01 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:15:01.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:15:01 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:15:01 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cb40041d0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:01 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e141 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:15:01 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:15:01 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cd8004100 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:01 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:15:01 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cb0001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:02 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:15:02 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:15:02 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:15:02.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:15:03 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:15:03 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:15:03 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:15:03.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:15:03 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:15:03 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cc0002c70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:03 np0005540826 podman[232629]: 2025-12-01 10:15:03.233458705 +0000 UTC m=+0.063002219 container health_status 9ce59c2758d021c154e97f8a3178b73d8f43045c59b9ba6fd93369a760a49136 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 05:15:03 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:15:03 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cb40041d0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:03 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:15:03 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cd8004100 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:03 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:15:03 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  1 05:15:04 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:15:04 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:15:04 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:15:04.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:15:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:15:04.547 141685 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  1 05:15:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:15:04.547 141685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  1 05:15:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:15:04.547 141685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  1 05:15:05 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:15:05 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:15:05 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:15:05.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:15:05 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:15:05 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cb0001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:05 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:15:05 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cb0001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:05 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:15:05 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cb40041d0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:06 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e142 e142: 3 total, 3 up, 3 in
Dec  1 05:15:06 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:15:06 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:15:06 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:15:06 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:15:06.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:15:06 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:15:06 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  1 05:15:06 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:15:06 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 05:15:06 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:15:06 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 05:15:07 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec  1 05:15:07 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2480622640' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec  1 05:15:07 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec  1 05:15:07 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2480622640' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec  1 05:15:07 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:15:07 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:15:07 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:15:07.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:15:07 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:15:07 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cd8004100 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:07 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e143 e143: 3 total, 3 up, 3 in
Dec  1 05:15:07 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:15:07 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cb0001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:07 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:15:07 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cc0002580 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:08 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:15:08 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:15:08 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:15:08.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:15:09 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:15:09 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:15:09 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:15:09.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:15:09 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:15:09 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cc0002580 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:09 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:15:09 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cd8004100 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:09 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:15:09 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cd8004100 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:10 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:15:10 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  1 05:15:10 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:15:10 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:15:10 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:15:10.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:15:11 np0005540826 podman[232679]: 2025-12-01 10:15:11.018189143 +0000 UTC m=+0.094242221 container health_status 6b06bbb7dde72e1297fe084ecc5611549b12415c8a2a7d771b9728c6924dbf55 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=ovn_controller, container_name=ovn_controller)
Dec  1 05:15:11 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:15:11 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:15:11 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:15:11.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:15:11 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:15:11 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cc0002580 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:11 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:15:11 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:15:11 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cc0002580 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:11 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:15:11 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cc0002580 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:12 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:15:12 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:15:12 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:15:12.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:15:13 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:15:13 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:15:13 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:15:13.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:15:13 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:15:13 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9ca8000b60 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:13 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:15:13 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cd8004100 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:13 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:15:13 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cc0002580 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:14 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:15:14 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:15:14 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:15:14.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:15:15 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:15:15 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:15:15 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:15:15.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:15:15 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:15:15 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cb40042e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:15 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 e144: 3 total, 3 up, 3 in
Dec  1 05:15:15 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:15:15 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9ca80016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:15 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:15:15 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cd8004100 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:15 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis[85207]: [WARNING] 334/101515 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  1 05:15:16 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:15:16 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:15:16 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:15:16 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:15:16.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:15:17 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:15:17 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:15:17 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:15:17.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:15:17 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:15:17 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cc0002580 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:17 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:15:17 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cb4004300 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:17 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:15:17 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9ca80016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:18 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:15:18 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:15:18 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:15:18.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:15:18 np0005540826 podman[232711]: 2025-12-01 10:15:18.965616503 +0000 UTC m=+0.051172840 container health_status 2f6a34a2eb1139886d5df897b4264e2aa17883bd121a8757ac91426f6d44356f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec  1 05:15:19 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:15:19 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:15:19 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:15:19.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:15:19 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:15:19 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cd8004100 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:19 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:15:19 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cc0002580 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:19 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:15:19 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cb4004320 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:20 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:15:20 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:15:20 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:15:20.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:15:21 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:15:21 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:15:21 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:15:21.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:15:21 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:15:21 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9ca80016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:21 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:15:21 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:15:21 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cd8004a20 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:21 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:15:21 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cc0002580 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:22 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:15:22 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:15:22 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:15:22.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:15:23 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:15:23 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:15:23 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:15:23.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:15:23 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:15:23 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cc0002580 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:23 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:15:23 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9ca8002b10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:23 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:15:23 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cd8004a20 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:24 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:15:24 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:15:24 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:15:24.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:15:25 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:15:25 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:15:25 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:15:25.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:15:25 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:15:25 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cc0002580 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:25 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:15:25 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cb4004380 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:25 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:15:25 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9ca8002b10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:26 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:15:26 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:15:26 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:15:26 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:15:26.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:15:27 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:15:27 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:15:27 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:15:27.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:15:27 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:15:27 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cd8004a20 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:27 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:15:27 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cd8004a20 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:27 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[231374]: 01/12/2025 10:15:27 : epoch 692d6a37 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9cd8004a20 fd 48 proxy ignored for local
Dec  1 05:15:27 np0005540826 kernel: ganesha.nfsd[232369]: segfault at 50 ip 00007f9d9060d32e sp 00007f9d637fd210 error 4 in libntirpc.so.5.8[7f9d905f2000+2c000] likely on CPU 4 (core 0, socket 4)
Dec  1 05:15:27 np0005540826 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Dec  1 05:15:27 np0005540826 systemd[1]: Started Process Core Dump (PID 232759/UID 0).
Dec  1 05:15:28 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:15:28 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:15:28 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:15:28.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:15:28 np0005540826 systemd-coredump[232760]: Process 231378 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 60:#012#0  0x00007f9d9060d32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Dec  1 05:15:29 np0005540826 systemd[1]: systemd-coredump@13-232759-0.service: Deactivated successfully.
Dec  1 05:15:29 np0005540826 systemd[1]: systemd-coredump@13-232759-0.service: Consumed 1.057s CPU time.
Dec  1 05:15:29 np0005540826 podman[232766]: 2025-12-01 10:15:29.077385405 +0000 UTC m=+0.026740663 container died 3fde69acbf8c5ca1571bb25125443a717902fa3adf1a0fda17aa4481bb855d19 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, ceph=True, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325)
Dec  1 05:15:29 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:15:29 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:15:29 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:15:29.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:15:29 np0005540826 systemd[1]: var-lib-containers-storage-overlay-5b1fefe4355833a2d48e3d6f82957e49066c00b54f2faa94327822593b8084b8-merged.mount: Deactivated successfully.
Dec  1 05:15:29 np0005540826 podman[232766]: 2025-12-01 10:15:29.675966071 +0000 UTC m=+0.625321339 container remove 3fde69acbf8c5ca1571bb25125443a717902fa3adf1a0fda17aa4481bb855d19 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default)
Dec  1 05:15:29 np0005540826 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.0.0.compute-1.osfnzc.service: Main process exited, code=exited, status=139/n/a
Dec  1 05:15:29 np0005540826 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.0.0.compute-1.osfnzc.service: Failed with result 'exit-code'.
Dec  1 05:15:29 np0005540826 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.0.0.compute-1.osfnzc.service: Consumed 1.614s CPU time.
Dec  1 05:15:30 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:15:30 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:15:30 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:15:30.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:15:31 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:15:31 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:15:31 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:15:31.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:15:31 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:15:32 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:15:32 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:15:32 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:15:32.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:15:33 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:15:33 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:15:33 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:15:33.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:15:33 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis[85207]: [WARNING] 334/101533 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  1 05:15:33 np0005540826 podman[232811]: 2025-12-01 10:15:33.983987822 +0000 UTC m=+0.068571708 container health_status 9ce59c2758d021c154e97f8a3178b73d8f43045c59b9ba6fd93369a760a49136 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec  1 05:15:34 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:15:34 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:15:34 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:15:34.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:15:35 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:15:35 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:15:35 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:15:35.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:15:36 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:15:36 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:15:36 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:15:36 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:15:36.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:15:37 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:15:37 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:15:37 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:15:37.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:15:38 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:15:38 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:15:38 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:15:38.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:15:39 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:15:39 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:15:39 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:15:39.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:15:39 np0005540826 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.0.0.compute-1.osfnzc.service: Scheduled restart job, restart counter is at 14.
Dec  1 05:15:39 np0005540826 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.osfnzc for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 05:15:39 np0005540826 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.0.0.compute-1.osfnzc.service: Consumed 1.614s CPU time.
Dec  1 05:15:39 np0005540826 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.osfnzc for 365f19c2-81e5-5edd-b6b4-280555214d3a...
Dec  1 05:15:40 np0005540826 podman[232879]: 2025-12-01 10:15:40.110406873 +0000 UTC m=+0.025580315 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 05:15:40 np0005540826 podman[232879]: 2025-12-01 10:15:40.237374422 +0000 UTC m=+0.152547834 container create 5c2a0c3e84dee36c8ff9eefbeac4395d62f026409e74501ec2627429fc6e7beb (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc, ceph=True, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 05:15:40 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/895194d5328c9f84f7196a01204a02006bbf476e63b8591e08eccaf668a19aa5/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Dec  1 05:15:40 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/895194d5328c9f84f7196a01204a02006bbf476e63b8591e08eccaf668a19aa5/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 05:15:40 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/895194d5328c9f84f7196a01204a02006bbf476e63b8591e08eccaf668a19aa5/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 05:15:40 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/895194d5328c9f84f7196a01204a02006bbf476e63b8591e08eccaf668a19aa5/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.osfnzc-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 05:15:40 np0005540826 podman[232879]: 2025-12-01 10:15:40.450813298 +0000 UTC m=+0.365986740 container init 5c2a0c3e84dee36c8ff9eefbeac4395d62f026409e74501ec2627429fc6e7beb (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, CEPH_REF=squid)
Dec  1 05:15:40 np0005540826 podman[232879]: 2025-12-01 10:15:40.456199557 +0000 UTC m=+0.371372969 container start 5c2a0c3e84dee36c8ff9eefbeac4395d62f026409e74501ec2627429fc6e7beb (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.build-date=20250325)
Dec  1 05:15:40 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:15:40 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Dec  1 05:15:40 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:15:40 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Dec  1 05:15:40 np0005540826 bash[232879]: 5c2a0c3e84dee36c8ff9eefbeac4395d62f026409e74501ec2627429fc6e7beb
Dec  1 05:15:40 np0005540826 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.osfnzc for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 05:15:40 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:15:40 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Dec  1 05:15:40 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:15:40 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Dec  1 05:15:40 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:15:40 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Dec  1 05:15:40 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:15:40 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Dec  1 05:15:40 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:15:40 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Dec  1 05:15:40 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:15:40 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  1 05:15:40 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:15:40 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:15:40 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:15:40.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:15:41 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:15:41 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:15:41 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:15:41.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:15:41 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:15:41 np0005540826 podman[232937]: 2025-12-01 10:15:41.996128681 +0000 UTC m=+0.080530345 container health_status 6b06bbb7dde72e1297fe084ecc5611549b12415c8a2a7d771b9728c6924dbf55 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec  1 05:15:42 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:15:42 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:15:42 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:15:42.592 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:15:43 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:15:43 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:15:43 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:15:43.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:15:44 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:15:44 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:15:44 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:15:44.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:15:45 np0005540826 nova_compute[229148]: 2025-12-01 10:15:45.109 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:15:45 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:15:45 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:15:45 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:15:45.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:15:46 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:15:46 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:15:46 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:15:46 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:15:46.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:15:46 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:15:46 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  1 05:15:46 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:15:46 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 05:15:46 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:15:46.737 141685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '36:10:da', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '4e:5c:35:98:90:37'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  1 05:15:46 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:15:46.738 141685 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  1 05:15:47 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:15:47 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:15:47 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:15:47.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:15:48 np0005540826 nova_compute[229148]: 2025-12-01 10:15:48.105 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:15:48 np0005540826 nova_compute[229148]: 2025-12-01 10:15:48.122 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:15:48 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:15:48 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:15:48 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:15:48.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:15:49 np0005540826 nova_compute[229148]: 2025-12-01 10:15:49.109 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:15:49 np0005540826 nova_compute[229148]: 2025-12-01 10:15:49.110 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  1 05:15:49 np0005540826 nova_compute[229148]: 2025-12-01 10:15:49.110 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  1 05:15:49 np0005540826 nova_compute[229148]: 2025-12-01 10:15:49.166 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  1 05:15:49 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:15:49 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:15:49 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:15:49.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:15:49 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:15:49.740 141685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=b99910e3-15ec-4cc7-b887-f5229f22d165, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '4'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  1 05:15:49 np0005540826 podman[232992]: 2025-12-01 10:15:49.968954055 +0000 UTC m=+0.049701975 container health_status 2f6a34a2eb1139886d5df897b4264e2aa17883bd121a8757ac91426f6d44356f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec  1 05:15:50 np0005540826 nova_compute[229148]: 2025-12-01 10:15:50.110 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:15:50 np0005540826 nova_compute[229148]: 2025-12-01 10:15:50.110 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:15:50 np0005540826 nova_compute[229148]: 2025-12-01 10:15:50.111 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  1 05:15:50 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:15:50 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:15:50 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:15:50.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:15:51 np0005540826 nova_compute[229148]: 2025-12-01 10:15:51.110 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:15:51 np0005540826 nova_compute[229148]: 2025-12-01 10:15:51.110 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:15:51 np0005540826 nova_compute[229148]: 2025-12-01 10:15:51.111 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:15:51 np0005540826 nova_compute[229148]: 2025-12-01 10:15:51.136 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:15:51 np0005540826 nova_compute[229148]: 2025-12-01 10:15:51.137 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:15:51 np0005540826 nova_compute[229148]: 2025-12-01 10:15:51.137 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:15:51 np0005540826 nova_compute[229148]: 2025-12-01 10:15:51.137 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  1 05:15:51 np0005540826 nova_compute[229148]: 2025-12-01 10:15:51.137 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:15:51 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:15:51 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:15:51 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:15:51.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:15:51 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:15:51 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3930119811' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:15:51 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:15:51 np0005540826 nova_compute[229148]: 2025-12-01 10:15:51.607 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.470s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:15:51 np0005540826 nova_compute[229148]: 2025-12-01 10:15:51.744 229152 WARNING nova.virt.libvirt.driver [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  1 05:15:51 np0005540826 nova_compute[229148]: 2025-12-01 10:15:51.745 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5248MB free_disk=59.94269943237305GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  1 05:15:51 np0005540826 nova_compute[229148]: 2025-12-01 10:15:51.745 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:15:51 np0005540826 nova_compute[229148]: 2025-12-01 10:15:51.746 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:15:51 np0005540826 nova_compute[229148]: 2025-12-01 10:15:51.797 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  1 05:15:51 np0005540826 nova_compute[229148]: 2025-12-01 10:15:51.798 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  1 05:15:51 np0005540826 nova_compute[229148]: 2025-12-01 10:15:51.818 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:15:52 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:15:52 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/410573799' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:15:52 np0005540826 nova_compute[229148]: 2025-12-01 10:15:52.270 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.452s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:15:52 np0005540826 nova_compute[229148]: 2025-12-01 10:15:52.276 229152 DEBUG nova.compute.provider_tree [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Inventory has not changed in ProviderTree for provider: 19014d04-db84-4f3d-831b-084720e9168c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  1 05:15:52 np0005540826 nova_compute[229148]: 2025-12-01 10:15:52.297 229152 DEBUG nova.scheduler.client.report [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Inventory has not changed for provider 19014d04-db84-4f3d-831b-084720e9168c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  1 05:15:52 np0005540826 nova_compute[229148]: 2025-12-01 10:15:52.299 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  1 05:15:52 np0005540826 nova_compute[229148]: 2025-12-01 10:15:52.299 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.554s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:15:52 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:15:52 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:15:52 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:15:52.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:15:52 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:15:52 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  1 05:15:52 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:15:52 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Dec  1 05:15:52 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:15:52 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Dec  1 05:15:52 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:15:52 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Dec  1 05:15:52 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:15:52 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Dec  1 05:15:52 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:15:52 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Dec  1 05:15:52 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:15:52 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Dec  1 05:15:52 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:15:52 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:15:52 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:15:52 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:15:52 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:15:52 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:15:52 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:15:52 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Dec  1 05:15:52 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:15:52 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:15:52 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:15:52 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Dec  1 05:15:52 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:15:52 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Dec  1 05:15:52 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:15:52 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Dec  1 05:15:52 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:15:52 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Dec  1 05:15:52 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:15:52 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Dec  1 05:15:52 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:15:52 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Dec  1 05:15:52 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:15:52 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Dec  1 05:15:52 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:15:52 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Dec  1 05:15:52 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:15:52 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Dec  1 05:15:52 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:15:52 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Dec  1 05:15:52 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:15:52 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Dec  1 05:15:52 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:15:52 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Dec  1 05:15:52 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:15:52 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  1 05:15:52 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:15:52 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Dec  1 05:15:52 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:15:52 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  1 05:15:53 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:15:53 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb4c000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:53 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:15:53 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:15:53 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:15:53.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:15:53 np0005540826 nova_compute[229148]: 2025-12-01 10:15:53.295 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:15:53 np0005540826 nova_compute[229148]: 2025-12-01 10:15:53.417 229152 DEBUG oslo_concurrency.lockutils [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Acquiring lock "340ca5cd-1261-42f0-9157-5cd4fad54b63" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:15:53 np0005540826 nova_compute[229148]: 2025-12-01 10:15:53.418 229152 DEBUG oslo_concurrency.lockutils [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "340ca5cd-1261-42f0-9157-5cd4fad54b63" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:15:53 np0005540826 nova_compute[229148]: 2025-12-01 10:15:53.432 229152 DEBUG nova.compute.manager [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 340ca5cd-1261-42f0-9157-5cd4fad54b63] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  1 05:15:53 np0005540826 nova_compute[229148]: 2025-12-01 10:15:53.508 229152 DEBUG oslo_concurrency.lockutils [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:15:53 np0005540826 nova_compute[229148]: 2025-12-01 10:15:53.508 229152 DEBUG oslo_concurrency.lockutils [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:15:53 np0005540826 nova_compute[229148]: 2025-12-01 10:15:53.513 229152 DEBUG nova.virt.hardware [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  1 05:15:53 np0005540826 nova_compute[229148]: 2025-12-01 10:15:53.514 229152 INFO nova.compute.claims [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 340ca5cd-1261-42f0-9157-5cd4fad54b63] Claim successful on node compute-1.ctlplane.example.com#033[00m
Dec  1 05:15:53 np0005540826 nova_compute[229148]: 2025-12-01 10:15:53.610 229152 DEBUG oslo_concurrency.processutils [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:15:53 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:15:53 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb40001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:53 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:15:53 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb28000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:54 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:15:54 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2861935731' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:15:54 np0005540826 nova_compute[229148]: 2025-12-01 10:15:54.080 229152 DEBUG oslo_concurrency.processutils [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.470s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:15:54 np0005540826 nova_compute[229148]: 2025-12-01 10:15:54.088 229152 DEBUG nova.compute.provider_tree [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Inventory has not changed in ProviderTree for provider: 19014d04-db84-4f3d-831b-084720e9168c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  1 05:15:54 np0005540826 nova_compute[229148]: 2025-12-01 10:15:54.103 229152 DEBUG nova.scheduler.client.report [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Inventory has not changed for provider 19014d04-db84-4f3d-831b-084720e9168c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  1 05:15:54 np0005540826 nova_compute[229148]: 2025-12-01 10:15:54.127 229152 DEBUG oslo_concurrency.lockutils [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.618s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:15:54 np0005540826 nova_compute[229148]: 2025-12-01 10:15:54.128 229152 DEBUG nova.compute.manager [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 340ca5cd-1261-42f0-9157-5cd4fad54b63] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  1 05:15:54 np0005540826 nova_compute[229148]: 2025-12-01 10:15:54.177 229152 DEBUG nova.compute.manager [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 340ca5cd-1261-42f0-9157-5cd4fad54b63] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  1 05:15:54 np0005540826 nova_compute[229148]: 2025-12-01 10:15:54.177 229152 DEBUG nova.network.neutron [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 340ca5cd-1261-42f0-9157-5cd4fad54b63] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  1 05:15:54 np0005540826 nova_compute[229148]: 2025-12-01 10:15:54.201 229152 INFO nova.virt.libvirt.driver [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 340ca5cd-1261-42f0-9157-5cd4fad54b63] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  1 05:15:54 np0005540826 nova_compute[229148]: 2025-12-01 10:15:54.216 229152 DEBUG nova.compute.manager [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 340ca5cd-1261-42f0-9157-5cd4fad54b63] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  1 05:15:54 np0005540826 nova_compute[229148]: 2025-12-01 10:15:54.308 229152 DEBUG nova.compute.manager [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 340ca5cd-1261-42f0-9157-5cd4fad54b63] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  1 05:15:54 np0005540826 nova_compute[229148]: 2025-12-01 10:15:54.309 229152 DEBUG nova.virt.libvirt.driver [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 340ca5cd-1261-42f0-9157-5cd4fad54b63] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  1 05:15:54 np0005540826 nova_compute[229148]: 2025-12-01 10:15:54.310 229152 INFO nova.virt.libvirt.driver [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 340ca5cd-1261-42f0-9157-5cd4fad54b63] Creating image(s)#033[00m
Dec  1 05:15:54 np0005540826 nova_compute[229148]: 2025-12-01 10:15:54.341 229152 DEBUG nova.storage.rbd_utils [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] rbd image 340ca5cd-1261-42f0-9157-5cd4fad54b63_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Dec  1 05:15:54 np0005540826 nova_compute[229148]: 2025-12-01 10:15:54.372 229152 DEBUG nova.storage.rbd_utils [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] rbd image 340ca5cd-1261-42f0-9157-5cd4fad54b63_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Dec  1 05:15:54 np0005540826 nova_compute[229148]: 2025-12-01 10:15:54.403 229152 DEBUG nova.storage.rbd_utils [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] rbd image 340ca5cd-1261-42f0-9157-5cd4fad54b63_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Dec  1 05:15:54 np0005540826 nova_compute[229148]: 2025-12-01 10:15:54.408 229152 DEBUG oslo_concurrency.lockutils [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Acquiring lock "caad95fa2cc8ed03bed2e9851744954b07ec7b34" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:15:54 np0005540826 nova_compute[229148]: 2025-12-01 10:15:54.410 229152 DEBUG oslo_concurrency.lockutils [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "caad95fa2cc8ed03bed2e9851744954b07ec7b34" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:15:54 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:15:54 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:15:54 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:15:54.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:15:54 np0005540826 nova_compute[229148]: 2025-12-01 10:15:54.715 229152 WARNING oslo_policy.policy [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.#033[00m
Dec  1 05:15:54 np0005540826 nova_compute[229148]: 2025-12-01 10:15:54.717 229152 WARNING oslo_policy.policy [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.#033[00m
Dec  1 05:15:54 np0005540826 nova_compute[229148]: 2025-12-01 10:15:54.719 229152 DEBUG nova.policy [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5b56a238daf0445798410e51caada0ff', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9f6be4e572624210b91193c011607c08', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  1 05:15:54 np0005540826 nova_compute[229148]: 2025-12-01 10:15:54.748 229152 DEBUG nova.virt.libvirt.imagebackend [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Image locations are: [{'url': 'rbd://365f19c2-81e5-5edd-b6b4-280555214d3a/images/8f75d6de-6ce0-44e1-b417-d0111424475b/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://365f19c2-81e5-5edd-b6b4-280555214d3a/images/8f75d6de-6ce0-44e1-b417-d0111424475b/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085#033[00m
Dec  1 05:15:55 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis[85207]: [WARNING] 334/101555 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  1 05:15:55 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:15:55 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb24000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:55 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:15:55 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:15:55 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:15:55.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:15:55 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:15:55 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb30000fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:55 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis[85207]: [WARNING] 334/101555 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  1 05:15:55 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:15:55 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb40001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:55 np0005540826 nova_compute[229148]: 2025-12-01 10:15:55.890 229152 DEBUG oslo_concurrency.processutils [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/caad95fa2cc8ed03bed2e9851744954b07ec7b34.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:15:55 np0005540826 nova_compute[229148]: 2025-12-01 10:15:55.908 229152 DEBUG nova.network.neutron [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 340ca5cd-1261-42f0-9157-5cd4fad54b63] Successfully created port: 88d90ea0-2620-45cd-9740-631b9db0c778 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec  1 05:15:55 np0005540826 nova_compute[229148]: 2025-12-01 10:15:55.949 229152 DEBUG oslo_concurrency.processutils [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/caad95fa2cc8ed03bed2e9851744954b07ec7b34.part --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:15:55 np0005540826 nova_compute[229148]: 2025-12-01 10:15:55.950 229152 DEBUG nova.virt.images [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] 8f75d6de-6ce0-44e1-b417-d0111424475b was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242#033[00m
Dec  1 05:15:55 np0005540826 nova_compute[229148]: 2025-12-01 10:15:55.951 229152 DEBUG nova.privsep.utils [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Dec  1 05:15:55 np0005540826 nova_compute[229148]: 2025-12-01 10:15:55.951 229152 DEBUG oslo_concurrency.processutils [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/caad95fa2cc8ed03bed2e9851744954b07ec7b34.part /var/lib/nova/instances/_base/caad95fa2cc8ed03bed2e9851744954b07ec7b34.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:15:56 np0005540826 nova_compute[229148]: 2025-12-01 10:15:56.118 229152 DEBUG oslo_concurrency.processutils [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/caad95fa2cc8ed03bed2e9851744954b07ec7b34.part /var/lib/nova/instances/_base/caad95fa2cc8ed03bed2e9851744954b07ec7b34.converted" returned: 0 in 0.167s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:15:56 np0005540826 nova_compute[229148]: 2025-12-01 10:15:56.123 229152 DEBUG oslo_concurrency.processutils [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/caad95fa2cc8ed03bed2e9851744954b07ec7b34.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:15:56 np0005540826 nova_compute[229148]: 2025-12-01 10:15:56.176 229152 DEBUG oslo_concurrency.processutils [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/caad95fa2cc8ed03bed2e9851744954b07ec7b34.converted --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:15:56 np0005540826 nova_compute[229148]: 2025-12-01 10:15:56.177 229152 DEBUG oslo_concurrency.lockutils [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "caad95fa2cc8ed03bed2e9851744954b07ec7b34" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.768s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:15:56 np0005540826 nova_compute[229148]: 2025-12-01 10:15:56.200 229152 DEBUG nova.storage.rbd_utils [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] rbd image 340ca5cd-1261-42f0-9157-5cd4fad54b63_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Dec  1 05:15:56 np0005540826 nova_compute[229148]: 2025-12-01 10:15:56.203 229152 DEBUG oslo_concurrency.processutils [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/caad95fa2cc8ed03bed2e9851744954b07ec7b34 340ca5cd-1261-42f0-9157-5cd4fad54b63_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:15:56 np0005540826 nova_compute[229148]: 2025-12-01 10:15:56.484 229152 DEBUG oslo_concurrency.processutils [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/caad95fa2cc8ed03bed2e9851744954b07ec7b34 340ca5cd-1261-42f0-9157-5cd4fad54b63_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.281s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:15:56 np0005540826 nova_compute[229148]: 2025-12-01 10:15:56.560 229152 DEBUG nova.storage.rbd_utils [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] resizing rbd image 340ca5cd-1261-42f0-9157-5cd4fad54b63_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Dec  1 05:15:56 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:15:56 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:15:56 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:15:56 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:15:56.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:15:56 np0005540826 nova_compute[229148]: 2025-12-01 10:15:56.667 229152 DEBUG nova.objects.instance [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lazy-loading 'migration_context' on Instance uuid 340ca5cd-1261-42f0-9157-5cd4fad54b63 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  1 05:15:56 np0005540826 nova_compute[229148]: 2025-12-01 10:15:56.688 229152 DEBUG nova.virt.libvirt.driver [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 340ca5cd-1261-42f0-9157-5cd4fad54b63] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  1 05:15:56 np0005540826 nova_compute[229148]: 2025-12-01 10:15:56.689 229152 DEBUG nova.virt.libvirt.driver [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 340ca5cd-1261-42f0-9157-5cd4fad54b63] Ensure instance console log exists: /var/lib/nova/instances/340ca5cd-1261-42f0-9157-5cd4fad54b63/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  1 05:15:56 np0005540826 nova_compute[229148]: 2025-12-01 10:15:56.689 229152 DEBUG oslo_concurrency.lockutils [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:15:56 np0005540826 nova_compute[229148]: 2025-12-01 10:15:56.690 229152 DEBUG oslo_concurrency.lockutils [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:15:56 np0005540826 nova_compute[229148]: 2025-12-01 10:15:56.690 229152 DEBUG oslo_concurrency.lockutils [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:15:57 np0005540826 nova_compute[229148]: 2025-12-01 10:15:57.084 229152 DEBUG nova.network.neutron [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 340ca5cd-1261-42f0-9157-5cd4fad54b63] Successfully updated port: 88d90ea0-2620-45cd-9740-631b9db0c778 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  1 05:15:57 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:15:57 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb280016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:57 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:15:57 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:15:57 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:15:57.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:15:57 np0005540826 nova_compute[229148]: 2025-12-01 10:15:57.271 229152 DEBUG nova.compute.manager [req-6f4750ab-9bde-452f-b4bd-57a8aabd4194 req-1a58bb7b-540a-4d25-82c9-0d500527b408 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 340ca5cd-1261-42f0-9157-5cd4fad54b63] Received event network-changed-88d90ea0-2620-45cd-9740-631b9db0c778 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  1 05:15:57 np0005540826 nova_compute[229148]: 2025-12-01 10:15:57.271 229152 DEBUG nova.compute.manager [req-6f4750ab-9bde-452f-b4bd-57a8aabd4194 req-1a58bb7b-540a-4d25-82c9-0d500527b408 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 340ca5cd-1261-42f0-9157-5cd4fad54b63] Refreshing instance network info cache due to event network-changed-88d90ea0-2620-45cd-9740-631b9db0c778. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  1 05:15:57 np0005540826 nova_compute[229148]: 2025-12-01 10:15:57.272 229152 DEBUG oslo_concurrency.lockutils [req-6f4750ab-9bde-452f-b4bd-57a8aabd4194 req-1a58bb7b-540a-4d25-82c9-0d500527b408 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Acquiring lock "refresh_cache-340ca5cd-1261-42f0-9157-5cd4fad54b63" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  1 05:15:57 np0005540826 nova_compute[229148]: 2025-12-01 10:15:57.272 229152 DEBUG oslo_concurrency.lockutils [req-6f4750ab-9bde-452f-b4bd-57a8aabd4194 req-1a58bb7b-540a-4d25-82c9-0d500527b408 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Acquired lock "refresh_cache-340ca5cd-1261-42f0-9157-5cd4fad54b63" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  1 05:15:57 np0005540826 nova_compute[229148]: 2025-12-01 10:15:57.272 229152 DEBUG nova.network.neutron [req-6f4750ab-9bde-452f-b4bd-57a8aabd4194 req-1a58bb7b-540a-4d25-82c9-0d500527b408 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 340ca5cd-1261-42f0-9157-5cd4fad54b63] Refreshing network info cache for port 88d90ea0-2620-45cd-9740-631b9db0c778 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  1 05:15:57 np0005540826 nova_compute[229148]: 2025-12-01 10:15:57.274 229152 DEBUG oslo_concurrency.lockutils [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Acquiring lock "refresh_cache-340ca5cd-1261-42f0-9157-5cd4fad54b63" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  1 05:15:57 np0005540826 nova_compute[229148]: 2025-12-01 10:15:57.595 229152 DEBUG nova.network.neutron [req-6f4750ab-9bde-452f-b4bd-57a8aabd4194 req-1a58bb7b-540a-4d25-82c9-0d500527b408 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 340ca5cd-1261-42f0-9157-5cd4fad54b63] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  1 05:15:57 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:15:57 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb240016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:57 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:15:57 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb30001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:57 np0005540826 nova_compute[229148]: 2025-12-01 10:15:57.919 229152 DEBUG nova.network.neutron [req-6f4750ab-9bde-452f-b4bd-57a8aabd4194 req-1a58bb7b-540a-4d25-82c9-0d500527b408 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 340ca5cd-1261-42f0-9157-5cd4fad54b63] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  1 05:15:58 np0005540826 nova_compute[229148]: 2025-12-01 10:15:58.063 229152 DEBUG oslo_concurrency.lockutils [req-6f4750ab-9bde-452f-b4bd-57a8aabd4194 req-1a58bb7b-540a-4d25-82c9-0d500527b408 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Releasing lock "refresh_cache-340ca5cd-1261-42f0-9157-5cd4fad54b63" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  1 05:15:58 np0005540826 nova_compute[229148]: 2025-12-01 10:15:58.064 229152 DEBUG oslo_concurrency.lockutils [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Acquired lock "refresh_cache-340ca5cd-1261-42f0-9157-5cd4fad54b63" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  1 05:15:58 np0005540826 nova_compute[229148]: 2025-12-01 10:15:58.064 229152 DEBUG nova.network.neutron [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 340ca5cd-1261-42f0-9157-5cd4fad54b63] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  1 05:15:58 np0005540826 nova_compute[229148]: 2025-12-01 10:15:58.206 229152 DEBUG nova.network.neutron [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 340ca5cd-1261-42f0-9157-5cd4fad54b63] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  1 05:15:58 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:15:58 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:15:58 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:15:58.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:15:58 np0005540826 nova_compute[229148]: 2025-12-01 10:15:58.977 229152 DEBUG nova.network.neutron [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 340ca5cd-1261-42f0-9157-5cd4fad54b63] Updating instance_info_cache with network_info: [{"id": "88d90ea0-2620-45cd-9740-631b9db0c778", "address": "fa:16:3e:68:11:60", "network": {"id": "01e9ffa8-19a6-4f5b-88bd-bb167984b351", "bridge": "br-int", "label": "tempest-network-smoke--1887862550", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88d90ea0-26", "ovs_interfaceid": "88d90ea0-2620-45cd-9740-631b9db0c778", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  1 05:15:59 np0005540826 nova_compute[229148]: 2025-12-01 10:15:59.047 229152 DEBUG oslo_concurrency.lockutils [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Releasing lock "refresh_cache-340ca5cd-1261-42f0-9157-5cd4fad54b63" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  1 05:15:59 np0005540826 nova_compute[229148]: 2025-12-01 10:15:59.047 229152 DEBUG nova.compute.manager [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 340ca5cd-1261-42f0-9157-5cd4fad54b63] Instance network_info: |[{"id": "88d90ea0-2620-45cd-9740-631b9db0c778", "address": "fa:16:3e:68:11:60", "network": {"id": "01e9ffa8-19a6-4f5b-88bd-bb167984b351", "bridge": "br-int", "label": "tempest-network-smoke--1887862550", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88d90ea0-26", "ovs_interfaceid": "88d90ea0-2620-45cd-9740-631b9db0c778", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  1 05:15:59 np0005540826 nova_compute[229148]: 2025-12-01 10:15:59.050 229152 DEBUG nova.virt.libvirt.driver [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 340ca5cd-1261-42f0-9157-5cd4fad54b63] Start _get_guest_xml network_info=[{"id": "88d90ea0-2620-45cd-9740-631b9db0c778", "address": "fa:16:3e:68:11:60", "network": {"id": "01e9ffa8-19a6-4f5b-88bd-bb167984b351", "bridge": "br-int", "label": "tempest-network-smoke--1887862550", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88d90ea0-26", "ovs_interfaceid": "88d90ea0-2620-45cd-9740-631b9db0c778", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-01T10:14:19Z,direct_url=<?>,disk_format='qcow2',id=8f75d6de-6ce0-44e1-b417-d0111424475b,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9a5734898a6345909986f17ddf57b27d',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-01T10:14:22Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'encryption_options': None, 'disk_bus': 'virtio', 'encryption_format': None, 'size': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'device_type': 'disk', 'image_id': '8f75d6de-6ce0-44e1-b417-d0111424475b'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  1 05:15:59 np0005540826 nova_compute[229148]: 2025-12-01 10:15:59.055 229152 WARNING nova.virt.libvirt.driver [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  1 05:15:59 np0005540826 nova_compute[229148]: 2025-12-01 10:15:59.059 229152 DEBUG nova.virt.libvirt.host [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  1 05:15:59 np0005540826 nova_compute[229148]: 2025-12-01 10:15:59.059 229152 DEBUG nova.virt.libvirt.host [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  1 05:15:59 np0005540826 nova_compute[229148]: 2025-12-01 10:15:59.062 229152 DEBUG nova.virt.libvirt.host [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  1 05:15:59 np0005540826 nova_compute[229148]: 2025-12-01 10:15:59.063 229152 DEBUG nova.virt.libvirt.host [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  1 05:15:59 np0005540826 nova_compute[229148]: 2025-12-01 10:15:59.063 229152 DEBUG nova.virt.libvirt.driver [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  1 05:15:59 np0005540826 nova_compute[229148]: 2025-12-01 10:15:59.063 229152 DEBUG nova.virt.hardware [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-01T10:14:17Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='2e731827-1896-49cd-b0cc-12903555d217',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-01T10:14:19Z,direct_url=<?>,disk_format='qcow2',id=8f75d6de-6ce0-44e1-b417-d0111424475b,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9a5734898a6345909986f17ddf57b27d',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-01T10:14:22Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  1 05:15:59 np0005540826 nova_compute[229148]: 2025-12-01 10:15:59.064 229152 DEBUG nova.virt.hardware [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  1 05:15:59 np0005540826 nova_compute[229148]: 2025-12-01 10:15:59.064 229152 DEBUG nova.virt.hardware [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  1 05:15:59 np0005540826 nova_compute[229148]: 2025-12-01 10:15:59.064 229152 DEBUG nova.virt.hardware [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  1 05:15:59 np0005540826 nova_compute[229148]: 2025-12-01 10:15:59.065 229152 DEBUG nova.virt.hardware [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  1 05:15:59 np0005540826 nova_compute[229148]: 2025-12-01 10:15:59.065 229152 DEBUG nova.virt.hardware [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  1 05:15:59 np0005540826 nova_compute[229148]: 2025-12-01 10:15:59.065 229152 DEBUG nova.virt.hardware [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  1 05:15:59 np0005540826 nova_compute[229148]: 2025-12-01 10:15:59.066 229152 DEBUG nova.virt.hardware [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  1 05:15:59 np0005540826 nova_compute[229148]: 2025-12-01 10:15:59.066 229152 DEBUG nova.virt.hardware [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  1 05:15:59 np0005540826 nova_compute[229148]: 2025-12-01 10:15:59.066 229152 DEBUG nova.virt.hardware [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  1 05:15:59 np0005540826 nova_compute[229148]: 2025-12-01 10:15:59.066 229152 DEBUG nova.virt.hardware [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  1 05:15:59 np0005540826 nova_compute[229148]: 2025-12-01 10:15:59.070 229152 DEBUG nova.privsep.utils [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Dec  1 05:15:59 np0005540826 nova_compute[229148]: 2025-12-01 10:15:59.071 229152 DEBUG oslo_concurrency.processutils [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:15:59 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:15:59 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb40001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:59 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:15:59 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:15:59 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:15:59.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:15:59 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec  1 05:15:59 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3782279426' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Dec  1 05:15:59 np0005540826 nova_compute[229148]: 2025-12-01 10:15:59.522 229152 DEBUG oslo_concurrency.processutils [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:15:59 np0005540826 nova_compute[229148]: 2025-12-01 10:15:59.551 229152 DEBUG nova.storage.rbd_utils [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] rbd image 340ca5cd-1261-42f0-9157-5cd4fad54b63_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Dec  1 05:15:59 np0005540826 nova_compute[229148]: 2025-12-01 10:15:59.555 229152 DEBUG oslo_concurrency.processutils [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:15:59 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:15:59 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb280016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:59 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:15:59 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb240016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:59 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:15:59 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:15:59 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 05:15:59 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:16:00 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec  1 05:16:00 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/888313626' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Dec  1 05:16:00 np0005540826 nova_compute[229148]: 2025-12-01 10:16:00.036 229152 DEBUG oslo_concurrency.processutils [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.481s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:16:00 np0005540826 nova_compute[229148]: 2025-12-01 10:16:00.040 229152 DEBUG nova.virt.libvirt.vif [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-01T10:15:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-12964730',display_name='tempest-TestNetworkBasicOps-server-12964730',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-12964730',id=2,image_ref='8f75d6de-6ce0-44e1-b417-d0111424475b',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCv5Ip5yIpGGSLrMU8LbJnpxZWACtK+hdg+ITbgbUfUMCkNJQKRZ8cD2eVi2crbelQ+qVMGzuSooy6ybCfyfCMLsGCSIrlE7jMMwO9xDQDOU4r5aBuK0vSncn0meVyEDag==',key_name='tempest-TestNetworkBasicOps-1464321504',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9f6be4e572624210b91193c011607c08',ramdisk_id='',reservation_id='r-eioa7ghm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='8f75d6de-6ce0-44e1-b417-d0111424475b',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1248115384',owner_user_name='tempest-TestNetworkBasicOps-1248115384-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-01T10:15:54Z,user_data=None,user_id='5b56a238daf0445798410e51caada0ff',uuid=340ca5cd-1261-42f0-9157-5cd4fad54b63,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "88d90ea0-2620-45cd-9740-631b9db0c778", "address": "fa:16:3e:68:11:60", "network": {"id": "01e9ffa8-19a6-4f5b-88bd-bb167984b351", "bridge": "br-int", "label": "tempest-network-smoke--1887862550", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88d90ea0-26", "ovs_interfaceid": "88d90ea0-2620-45cd-9740-631b9db0c778", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  1 05:16:00 np0005540826 nova_compute[229148]: 2025-12-01 10:16:00.040 229152 DEBUG nova.network.os_vif_util [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Converting VIF {"id": "88d90ea0-2620-45cd-9740-631b9db0c778", "address": "fa:16:3e:68:11:60", "network": {"id": "01e9ffa8-19a6-4f5b-88bd-bb167984b351", "bridge": "br-int", "label": "tempest-network-smoke--1887862550", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88d90ea0-26", "ovs_interfaceid": "88d90ea0-2620-45cd-9740-631b9db0c778", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  1 05:16:00 np0005540826 nova_compute[229148]: 2025-12-01 10:16:00.042 229152 DEBUG nova.network.os_vif_util [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:68:11:60,bridge_name='br-int',has_traffic_filtering=True,id=88d90ea0-2620-45cd-9740-631b9db0c778,network=Network(01e9ffa8-19a6-4f5b-88bd-bb167984b351),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap88d90ea0-26') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  1 05:16:00 np0005540826 nova_compute[229148]: 2025-12-01 10:16:00.044 229152 DEBUG nova.objects.instance [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lazy-loading 'pci_devices' on Instance uuid 340ca5cd-1261-42f0-9157-5cd4fad54b63 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  1 05:16:00 np0005540826 nova_compute[229148]: 2025-12-01 10:16:00.125 229152 DEBUG nova.virt.libvirt.driver [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 340ca5cd-1261-42f0-9157-5cd4fad54b63] End _get_guest_xml xml=<domain type="kvm">
Dec  1 05:16:00 np0005540826 nova_compute[229148]:  <uuid>340ca5cd-1261-42f0-9157-5cd4fad54b63</uuid>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:  <name>instance-00000002</name>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:  <memory>131072</memory>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:  <vcpu>1</vcpu>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:  <metadata>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  1 05:16:00 np0005540826 nova_compute[229148]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:      <nova:name>tempest-TestNetworkBasicOps-server-12964730</nova:name>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:      <nova:creationTime>2025-12-01 10:15:59</nova:creationTime>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:      <nova:flavor name="m1.nano">
Dec  1 05:16:00 np0005540826 nova_compute[229148]:        <nova:memory>128</nova:memory>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:        <nova:disk>1</nova:disk>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:        <nova:swap>0</nova:swap>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:        <nova:ephemeral>0</nova:ephemeral>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:        <nova:vcpus>1</nova:vcpus>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:      </nova:flavor>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:      <nova:owner>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:        <nova:user uuid="5b56a238daf0445798410e51caada0ff">tempest-TestNetworkBasicOps-1248115384-project-member</nova:user>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:        <nova:project uuid="9f6be4e572624210b91193c011607c08">tempest-TestNetworkBasicOps-1248115384</nova:project>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:      </nova:owner>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:      <nova:root type="image" uuid="8f75d6de-6ce0-44e1-b417-d0111424475b"/>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:      <nova:ports>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:        <nova:port uuid="88d90ea0-2620-45cd-9740-631b9db0c778">
Dec  1 05:16:00 np0005540826 nova_compute[229148]:          <nova:ip type="fixed" address="10.100.0.24" ipVersion="4"/>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:        </nova:port>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:      </nova:ports>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:    </nova:instance>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:  </metadata>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:  <sysinfo type="smbios">
Dec  1 05:16:00 np0005540826 nova_compute[229148]:    <system>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:      <entry name="manufacturer">RDO</entry>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:      <entry name="product">OpenStack Compute</entry>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:      <entry name="serial">340ca5cd-1261-42f0-9157-5cd4fad54b63</entry>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:      <entry name="uuid">340ca5cd-1261-42f0-9157-5cd4fad54b63</entry>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:      <entry name="family">Virtual Machine</entry>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:    </system>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:  </sysinfo>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:  <os>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:    <boot dev="hd"/>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:    <smbios mode="sysinfo"/>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:  </os>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:  <features>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:    <acpi/>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:    <apic/>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:    <vmcoreinfo/>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:  </features>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:  <clock offset="utc">
Dec  1 05:16:00 np0005540826 nova_compute[229148]:    <timer name="pit" tickpolicy="delay"/>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:    <timer name="hpet" present="no"/>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:  </clock>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:  <cpu mode="host-model" match="exact">
Dec  1 05:16:00 np0005540826 nova_compute[229148]:    <topology sockets="1" cores="1" threads="1"/>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:  </cpu>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:  <devices>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:    <disk type="network" device="disk">
Dec  1 05:16:00 np0005540826 nova_compute[229148]:      <driver type="raw" cache="none"/>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:      <source protocol="rbd" name="vms/340ca5cd-1261-42f0-9157-5cd4fad54b63_disk">
Dec  1 05:16:00 np0005540826 nova_compute[229148]:        <host name="192.168.122.100" port="6789"/>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:        <host name="192.168.122.102" port="6789"/>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:        <host name="192.168.122.101" port="6789"/>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:      </source>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:      <auth username="openstack">
Dec  1 05:16:00 np0005540826 nova_compute[229148]:        <secret type="ceph" uuid="365f19c2-81e5-5edd-b6b4-280555214d3a"/>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:      </auth>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:      <target dev="vda" bus="virtio"/>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:    </disk>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:    <disk type="network" device="cdrom">
Dec  1 05:16:00 np0005540826 nova_compute[229148]:      <driver type="raw" cache="none"/>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:      <source protocol="rbd" name="vms/340ca5cd-1261-42f0-9157-5cd4fad54b63_disk.config">
Dec  1 05:16:00 np0005540826 nova_compute[229148]:        <host name="192.168.122.100" port="6789"/>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:        <host name="192.168.122.102" port="6789"/>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:        <host name="192.168.122.101" port="6789"/>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:      </source>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:      <auth username="openstack">
Dec  1 05:16:00 np0005540826 nova_compute[229148]:        <secret type="ceph" uuid="365f19c2-81e5-5edd-b6b4-280555214d3a"/>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:      </auth>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:      <target dev="sda" bus="sata"/>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:    </disk>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:    <interface type="ethernet">
Dec  1 05:16:00 np0005540826 nova_compute[229148]:      <mac address="fa:16:3e:68:11:60"/>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:      <model type="virtio"/>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:      <driver name="vhost" rx_queue_size="512"/>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:      <mtu size="1442"/>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:      <target dev="tap88d90ea0-26"/>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:    </interface>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:    <serial type="pty">
Dec  1 05:16:00 np0005540826 nova_compute[229148]:      <log file="/var/lib/nova/instances/340ca5cd-1261-42f0-9157-5cd4fad54b63/console.log" append="off"/>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:    </serial>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:    <video>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:      <model type="virtio"/>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:    </video>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:    <input type="tablet" bus="usb"/>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:    <rng model="virtio">
Dec  1 05:16:00 np0005540826 nova_compute[229148]:      <backend model="random">/dev/urandom</backend>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:    </rng>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root"/>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:    <controller type="usb" index="0"/>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:    <memballoon model="virtio">
Dec  1 05:16:00 np0005540826 nova_compute[229148]:      <stats period="10"/>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:    </memballoon>
Dec  1 05:16:00 np0005540826 nova_compute[229148]:  </devices>
Dec  1 05:16:00 np0005540826 nova_compute[229148]: </domain>
Dec  1 05:16:00 np0005540826 nova_compute[229148]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  1 05:16:00 np0005540826 nova_compute[229148]: 2025-12-01 10:16:00.127 229152 DEBUG nova.compute.manager [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 340ca5cd-1261-42f0-9157-5cd4fad54b63] Preparing to wait for external event network-vif-plugged-88d90ea0-2620-45cd-9740-631b9db0c778 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  1 05:16:00 np0005540826 nova_compute[229148]: 2025-12-01 10:16:00.127 229152 DEBUG oslo_concurrency.lockutils [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Acquiring lock "340ca5cd-1261-42f0-9157-5cd4fad54b63-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:16:00 np0005540826 nova_compute[229148]: 2025-12-01 10:16:00.128 229152 DEBUG oslo_concurrency.lockutils [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "340ca5cd-1261-42f0-9157-5cd4fad54b63-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:16:00 np0005540826 nova_compute[229148]: 2025-12-01 10:16:00.128 229152 DEBUG oslo_concurrency.lockutils [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "340ca5cd-1261-42f0-9157-5cd4fad54b63-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:16:00 np0005540826 nova_compute[229148]: 2025-12-01 10:16:00.129 229152 DEBUG nova.virt.libvirt.vif [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-01T10:15:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-12964730',display_name='tempest-TestNetworkBasicOps-server-12964730',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-12964730',id=2,image_ref='8f75d6de-6ce0-44e1-b417-d0111424475b',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCv5Ip5yIpGGSLrMU8LbJnpxZWACtK+hdg+ITbgbUfUMCkNJQKRZ8cD2eVi2crbelQ+qVMGzuSooy6ybCfyfCMLsGCSIrlE7jMMwO9xDQDOU4r5aBuK0vSncn0meVyEDag==',key_name='tempest-TestNetworkBasicOps-1464321504',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9f6be4e572624210b91193c011607c08',ramdisk_id='',reservation_id='r-eioa7ghm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='8f75d6de-6ce0-44e1-b417-d0111424475b',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1248115384',owner_user_name='tempest-TestNetworkBasicOps-1248115384-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-01T10:15:54Z,user_data=None,user_id='5b56a238daf0445798410e51caada0ff',uuid=340ca5cd-1261-42f0-9157-5cd4fad54b63,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "88d90ea0-2620-45cd-9740-631b9db0c778", "address": "fa:16:3e:68:11:60", "network": {"id": "01e9ffa8-19a6-4f5b-88bd-bb167984b351", "bridge": "br-int", "label": "tempest-network-smoke--1887862550", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88d90ea0-26", "ovs_interfaceid": "88d90ea0-2620-45cd-9740-631b9db0c778", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  1 05:16:00 np0005540826 nova_compute[229148]: 2025-12-01 10:16:00.129 229152 DEBUG nova.network.os_vif_util [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Converting VIF {"id": "88d90ea0-2620-45cd-9740-631b9db0c778", "address": "fa:16:3e:68:11:60", "network": {"id": "01e9ffa8-19a6-4f5b-88bd-bb167984b351", "bridge": "br-int", "label": "tempest-network-smoke--1887862550", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88d90ea0-26", "ovs_interfaceid": "88d90ea0-2620-45cd-9740-631b9db0c778", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  1 05:16:00 np0005540826 nova_compute[229148]: 2025-12-01 10:16:00.129 229152 DEBUG nova.network.os_vif_util [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:68:11:60,bridge_name='br-int',has_traffic_filtering=True,id=88d90ea0-2620-45cd-9740-631b9db0c778,network=Network(01e9ffa8-19a6-4f5b-88bd-bb167984b351),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap88d90ea0-26') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  1 05:16:00 np0005540826 nova_compute[229148]: 2025-12-01 10:16:00.130 229152 DEBUG os_vif [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:68:11:60,bridge_name='br-int',has_traffic_filtering=True,id=88d90ea0-2620-45cd-9740-631b9db0c778,network=Network(01e9ffa8-19a6-4f5b-88bd-bb167984b351),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap88d90ea0-26') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  1 05:16:00 np0005540826 nova_compute[229148]: 2025-12-01 10:16:00.166 229152 DEBUG ovsdbapp.backend.ovs_idl [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Dec  1 05:16:00 np0005540826 nova_compute[229148]: 2025-12-01 10:16:00.167 229152 DEBUG ovsdbapp.backend.ovs_idl [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Dec  1 05:16:00 np0005540826 nova_compute[229148]: 2025-12-01 10:16:00.167 229152 DEBUG ovsdbapp.backend.ovs_idl [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Dec  1 05:16:00 np0005540826 nova_compute[229148]: 2025-12-01 10:16:00.167 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  1 05:16:00 np0005540826 nova_compute[229148]: 2025-12-01 10:16:00.168 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [POLLOUT] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:16:00 np0005540826 nova_compute[229148]: 2025-12-01 10:16:00.168 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  1 05:16:00 np0005540826 nova_compute[229148]: 2025-12-01 10:16:00.169 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:16:00 np0005540826 nova_compute[229148]: 2025-12-01 10:16:00.170 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:16:00 np0005540826 nova_compute[229148]: 2025-12-01 10:16:00.172 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:16:00 np0005540826 nova_compute[229148]: 2025-12-01 10:16:00.185 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:16:00 np0005540826 nova_compute[229148]: 2025-12-01 10:16:00.185 229152 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  1 05:16:00 np0005540826 nova_compute[229148]: 2025-12-01 10:16:00.186 229152 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  1 05:16:00 np0005540826 nova_compute[229148]: 2025-12-01 10:16:00.187 229152 INFO oslo.privsep.daemon [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpn5u9f3go/privsep.sock']#033[00m
Dec  1 05:16:00 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:16:00 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:16:00 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:16:00.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:16:00 np0005540826 nova_compute[229148]: 2025-12-01 10:16:00.984 229152 INFO oslo.privsep.daemon [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Spawned new privsep daemon via rootwrap#033[00m
Dec  1 05:16:00 np0005540826 nova_compute[229148]: 2025-12-01 10:16:00.769 233423 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Dec  1 05:16:00 np0005540826 nova_compute[229148]: 2025-12-01 10:16:00.774 233423 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Dec  1 05:16:00 np0005540826 nova_compute[229148]: 2025-12-01 10:16:00.776 233423 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none#033[00m
Dec  1 05:16:00 np0005540826 nova_compute[229148]: 2025-12-01 10:16:00.776 233423 INFO oslo.privsep.daemon [-] privsep daemon running as pid 233423#033[00m
Dec  1 05:16:01 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:16:01 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 05:16:01 np0005540826 ceph-mon[80026]: Health check cleared: CEPHADM_FAILED_DAEMON (was: 1 failed cephadm daemon(s))
Dec  1 05:16:01 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:01 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb30001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:01 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:16:01 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:16:01 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:16:01.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:16:01 np0005540826 nova_compute[229148]: 2025-12-01 10:16:01.332 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:16:01 np0005540826 nova_compute[229148]: 2025-12-01 10:16:01.333 229152 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap88d90ea0-26, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  1 05:16:01 np0005540826 nova_compute[229148]: 2025-12-01 10:16:01.334 229152 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap88d90ea0-26, col_values=(('external_ids', {'iface-id': '88d90ea0-2620-45cd-9740-631b9db0c778', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:68:11:60', 'vm-uuid': '340ca5cd-1261-42f0-9157-5cd4fad54b63'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  1 05:16:01 np0005540826 nova_compute[229148]: 2025-12-01 10:16:01.336 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:16:01 np0005540826 NetworkManager[48989]: <info>  [1764584161.3371] manager: (tap88d90ea0-26): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/23)
Dec  1 05:16:01 np0005540826 nova_compute[229148]: 2025-12-01 10:16:01.340 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  1 05:16:01 np0005540826 nova_compute[229148]: 2025-12-01 10:16:01.344 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:16:01 np0005540826 nova_compute[229148]: 2025-12-01 10:16:01.346 229152 INFO os_vif [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:68:11:60,bridge_name='br-int',has_traffic_filtering=True,id=88d90ea0-2620-45cd-9740-631b9db0c778,network=Network(01e9ffa8-19a6-4f5b-88bd-bb167984b351),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap88d90ea0-26')#033[00m
Dec  1 05:16:01 np0005540826 nova_compute[229148]: 2025-12-01 10:16:01.478 229152 DEBUG nova.virt.libvirt.driver [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  1 05:16:01 np0005540826 nova_compute[229148]: 2025-12-01 10:16:01.479 229152 DEBUG nova.virt.libvirt.driver [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  1 05:16:01 np0005540826 nova_compute[229148]: 2025-12-01 10:16:01.479 229152 DEBUG nova.virt.libvirt.driver [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] No VIF found with MAC fa:16:3e:68:11:60, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  1 05:16:01 np0005540826 nova_compute[229148]: 2025-12-01 10:16:01.480 229152 INFO nova.virt.libvirt.driver [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 340ca5cd-1261-42f0-9157-5cd4fad54b63] Using config drive#033[00m
Dec  1 05:16:01 np0005540826 nova_compute[229148]: 2025-12-01 10:16:01.508 229152 DEBUG nova.storage.rbd_utils [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] rbd image 340ca5cd-1261-42f0-9157-5cd4fad54b63_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Dec  1 05:16:01 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:16:01 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:01 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb40001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:01 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:01 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb280016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:02 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:16:02 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:16:02 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:16:02.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:16:02 np0005540826 nova_compute[229148]: 2025-12-01 10:16:02.768 229152 INFO nova.virt.libvirt.driver [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 340ca5cd-1261-42f0-9157-5cd4fad54b63] Creating config drive at /var/lib/nova/instances/340ca5cd-1261-42f0-9157-5cd4fad54b63/disk.config#033[00m
Dec  1 05:16:02 np0005540826 nova_compute[229148]: 2025-12-01 10:16:02.773 229152 DEBUG oslo_concurrency.processutils [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/340ca5cd-1261-42f0-9157-5cd4fad54b63/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp95_xxiry execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:16:02 np0005540826 nova_compute[229148]: 2025-12-01 10:16:02.904 229152 DEBUG oslo_concurrency.processutils [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/340ca5cd-1261-42f0-9157-5cd4fad54b63/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp95_xxiry" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:16:02 np0005540826 nova_compute[229148]: 2025-12-01 10:16:02.934 229152 DEBUG nova.storage.rbd_utils [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] rbd image 340ca5cd-1261-42f0-9157-5cd4fad54b63_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Dec  1 05:16:02 np0005540826 nova_compute[229148]: 2025-12-01 10:16:02.937 229152 DEBUG oslo_concurrency.processutils [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/340ca5cd-1261-42f0-9157-5cd4fad54b63/disk.config 340ca5cd-1261-42f0-9157-5cd4fad54b63_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:16:03 np0005540826 nova_compute[229148]: 2025-12-01 10:16:03.085 229152 DEBUG oslo_concurrency.processutils [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/340ca5cd-1261-42f0-9157-5cd4fad54b63/disk.config 340ca5cd-1261-42f0-9157-5cd4fad54b63_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.148s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:16:03 np0005540826 nova_compute[229148]: 2025-12-01 10:16:03.086 229152 INFO nova.virt.libvirt.driver [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 340ca5cd-1261-42f0-9157-5cd4fad54b63] Deleting local config drive /var/lib/nova/instances/340ca5cd-1261-42f0-9157-5cd4fad54b63/disk.config because it was imported into RBD.#033[00m
Dec  1 05:16:03 np0005540826 systemd[1]: Starting libvirt secret daemon...
Dec  1 05:16:03 np0005540826 systemd[1]: Started libvirt secret daemon.
Dec  1 05:16:03 np0005540826 kernel: tun: Universal TUN/TAP device driver, 1.6
Dec  1 05:16:03 np0005540826 kernel: tap88d90ea0-26: entered promiscuous mode
Dec  1 05:16:03 np0005540826 NetworkManager[48989]: <info>  [1764584163.1902] manager: (tap88d90ea0-26): new Tun device (/org/freedesktop/NetworkManager/Devices/24)
Dec  1 05:16:03 np0005540826 ovn_controller[132309]: 2025-12-01T10:16:03Z|00027|binding|INFO|Claiming lport 88d90ea0-2620-45cd-9740-631b9db0c778 for this chassis.
Dec  1 05:16:03 np0005540826 nova_compute[229148]: 2025-12-01 10:16:03.191 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:16:03 np0005540826 ovn_controller[132309]: 2025-12-01T10:16:03Z|00028|binding|INFO|88d90ea0-2620-45cd-9740-631b9db0c778: Claiming fa:16:3e:68:11:60 10.100.0.24
Dec  1 05:16:03 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:03 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb240016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:03 np0005540826 systemd-udevd[233522]: Network interface NamePolicy= disabled on kernel command line.
Dec  1 05:16:03 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:16:03 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:16:03 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:16:03.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:16:03 np0005540826 NetworkManager[48989]: <info>  [1764584163.2378] device (tap88d90ea0-26): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  1 05:16:03 np0005540826 NetworkManager[48989]: <info>  [1764584163.2388] device (tap88d90ea0-26): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  1 05:16:03 np0005540826 nova_compute[229148]: 2025-12-01 10:16:03.240 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:16:03 np0005540826 nova_compute[229148]: 2025-12-01 10:16:03.245 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:16:03 np0005540826 ovn_controller[132309]: 2025-12-01T10:16:03Z|00029|binding|INFO|Setting lport 88d90ea0-2620-45cd-9740-631b9db0c778 ovn-installed in OVS
Dec  1 05:16:03 np0005540826 systemd-machined[192474]: New machine qemu-1-instance-00000002.
Dec  1 05:16:03 np0005540826 nova_compute[229148]: 2025-12-01 10:16:03.247 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:16:03 np0005540826 systemd[1]: Started Virtual Machine qemu-1-instance-00000002.
Dec  1 05:16:03 np0005540826 ovn_controller[132309]: 2025-12-01T10:16:03Z|00030|binding|INFO|Setting lport 88d90ea0-2620-45cd-9740-631b9db0c778 up in Southbound
Dec  1 05:16:03 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:16:03.288 141685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:68:11:60 10.100.0.24'], port_security=['fa:16:3e:68:11:60 10.100.0.24'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.24/28', 'neutron:device_id': '340ca5cd-1261-42f0-9157-5cd4fad54b63', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-01e9ffa8-19a6-4f5b-88bd-bb167984b351', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9f6be4e572624210b91193c011607c08', 'neutron:revision_number': '2', 'neutron:security_group_ids': '1f410c70-f8b1-4f07-81a9-554fe3430914', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=94b5ae61-015c-4486-839e-601cac6764c5, chassis=[<ovs.db.idl.Row object at 0x7fe07c6626d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe07c6626d0>], logical_port=88d90ea0-2620-45cd-9740-631b9db0c778) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  1 05:16:03 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:16:03.289 141685 INFO neutron.agent.ovn.metadata.agent [-] Port 88d90ea0-2620-45cd-9740-631b9db0c778 in datapath 01e9ffa8-19a6-4f5b-88bd-bb167984b351 bound to our chassis#033[00m
Dec  1 05:16:03 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:16:03.291 141685 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 01e9ffa8-19a6-4f5b-88bd-bb167984b351#033[00m
Dec  1 05:16:03 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:16:03.292 141685 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmpqv502e1n/privsep.sock']#033[00m
Dec  1 05:16:03 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:03 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb240016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:03 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:03 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb40001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:03 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:03 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  1 05:16:03 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:16:03.955 141685 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Dec  1 05:16:03 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:16:03.956 141685 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpqv502e1n/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m
Dec  1 05:16:03 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:16:03.829 233565 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Dec  1 05:16:03 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:16:03.837 233565 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Dec  1 05:16:03 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:16:03.841 233565 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none#033[00m
Dec  1 05:16:03 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:16:03.841 233565 INFO oslo.privsep.daemon [-] privsep daemon running as pid 233565#033[00m
Dec  1 05:16:03 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:16:03.958 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[060bf056-f284-4f1c-b1c5-e5e28eb7b8b1]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:16:04 np0005540826 nova_compute[229148]: 2025-12-01 10:16:04.163 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:16:04 np0005540826 nova_compute[229148]: 2025-12-01 10:16:04.208 229152 DEBUG nova.virt.driver [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] Emitting event <LifecycleEvent: 1764584164.208058, 340ca5cd-1261-42f0-9157-5cd4fad54b63 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  1 05:16:04 np0005540826 nova_compute[229148]: 2025-12-01 10:16:04.209 229152 INFO nova.compute.manager [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] [instance: 340ca5cd-1261-42f0-9157-5cd4fad54b63] VM Started (Lifecycle Event)#033[00m
Dec  1 05:16:04 np0005540826 nova_compute[229148]: 2025-12-01 10:16:04.283 229152 DEBUG nova.compute.manager [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] [instance: 340ca5cd-1261-42f0-9157-5cd4fad54b63] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  1 05:16:04 np0005540826 nova_compute[229148]: 2025-12-01 10:16:04.287 229152 DEBUG nova.virt.driver [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] Emitting event <LifecycleEvent: 1764584164.2084606, 340ca5cd-1261-42f0-9157-5cd4fad54b63 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  1 05:16:04 np0005540826 nova_compute[229148]: 2025-12-01 10:16:04.288 229152 INFO nova.compute.manager [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] [instance: 340ca5cd-1261-42f0-9157-5cd4fad54b63] VM Paused (Lifecycle Event)#033[00m
Dec  1 05:16:04 np0005540826 nova_compute[229148]: 2025-12-01 10:16:04.464 229152 DEBUG nova.compute.manager [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] [instance: 340ca5cd-1261-42f0-9157-5cd4fad54b63] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  1 05:16:04 np0005540826 nova_compute[229148]: 2025-12-01 10:16:04.468 229152 DEBUG nova.compute.manager [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] [instance: 340ca5cd-1261-42f0-9157-5cd4fad54b63] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  1 05:16:04 np0005540826 nova_compute[229148]: 2025-12-01 10:16:04.510 229152 INFO nova.compute.manager [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] [instance: 340ca5cd-1261-42f0-9157-5cd4fad54b63] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  1 05:16:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:16:04.534 233565 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:16:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:16:04.534 233565 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:16:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:16:04.534 233565 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:16:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:16:04.547 141685 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:16:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:16:04.548 141685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:16:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:16:04.549 141685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:16:04 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:16:04 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:16:04 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:16:04.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:16:04 np0005540826 podman[233613]: 2025-12-01 10:16:04.978056542 +0000 UTC m=+0.060059373 container health_status 9ce59c2758d021c154e97f8a3178b73d8f43045c59b9ba6fd93369a760a49136 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec  1 05:16:05 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:16:05.173 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[f6daae05-3856-496b-8a7b-87673ab03f45]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:16:05 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:16:05.174 141685 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap01e9ffa8-11 in ovnmeta-01e9ffa8-19a6-4f5b-88bd-bb167984b351 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec  1 05:16:05 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:16:05.175 233565 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap01e9ffa8-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec  1 05:16:05 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:16:05.176 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[f17172a9-d460-48db-95b6-ba0ff6121c0c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:16:05 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:16:05.179 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[9bf60337-9036-40a4-a491-6c859fc689a3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:16:05 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:16:05.205 141797 DEBUG oslo.privsep.daemon [-] privsep: reply[9371a7a9-258d-4d4c-838e-c5e4ab699aa8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:16:05 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:05 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb28002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:05 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:16:05.226 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[232ad537-a8e4-4ad5-939e-37c9e0da5c37]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:16:05 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:16:05.228 141685 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmp1bjaowuz/privsep.sock']#033[00m
Dec  1 05:16:05 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:16:05 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:16:05 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:16:05.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:16:05 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:05 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb30002b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:05 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:05 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb24002f00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:05 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:16:05.866 141685 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Dec  1 05:16:05 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:16:05.866 141685 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp1bjaowuz/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m
Dec  1 05:16:05 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:16:05.743 233643 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Dec  1 05:16:05 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:16:05.747 233643 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Dec  1 05:16:05 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:16:05.749 233643 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none#033[00m
Dec  1 05:16:05 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:16:05.749 233643 INFO oslo.privsep.daemon [-] privsep daemon running as pid 233643#033[00m
Dec  1 05:16:05 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:16:05.869 233643 DEBUG oslo.privsep.daemon [-] privsep: reply[f1b4f022-d995-4a60-ad4d-528b61ed60da]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:16:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:16:06.366 233643 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:16:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:16:06.366 233643 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:16:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:16:06.366 233643 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:16:06 np0005540826 nova_compute[229148]: 2025-12-01 10:16:06.374 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:16:06 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:16:06 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:16:06 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.002000048s ======
Dec  1 05:16:06 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:16:06.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000048s
Dec  1 05:16:06 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:06 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  1 05:16:06 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:06 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 05:16:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:16:06.944 233643 DEBUG oslo.privsep.daemon [-] privsep: reply[12a65dc9-08c3-4501-b51b-fc6e5622b15c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:16:06 np0005540826 NetworkManager[48989]: <info>  [1764584166.9728] manager: (tap01e9ffa8-10): new Veth device (/org/freedesktop/NetworkManager/Devices/25)
Dec  1 05:16:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:16:06.971 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[ddecc4df-362b-4486-9b16-d9508b631164]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:16:07 np0005540826 systemd-udevd[233681]: Network interface NamePolicy= disabled on kernel command line.
Dec  1 05:16:07 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:16:07.006 233643 DEBUG oslo.privsep.daemon [-] privsep: reply[47622ed2-9b44-4888-ba57-3fccd6def165]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:16:07 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:16:07.009 233643 DEBUG oslo.privsep.daemon [-] privsep: reply[69846368-eded-47bf-8fa4-b7312596e6ed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:16:07 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:16:07 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:16:07 np0005540826 NetworkManager[48989]: <info>  [1764584167.0417] device (tap01e9ffa8-10): carrier: link connected
Dec  1 05:16:07 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:16:07.050 233643 DEBUG oslo.privsep.daemon [-] privsep: reply[481e5276-8e1c-4e8d-b528-93b45e853f80]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:16:07 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:16:07.074 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[07130d7c-90ab-4dcb-8108-4b05f7b6adc3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap01e9ffa8-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:33:03:1e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 13], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 402956, 'reachable_time': 19773, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233699, 'error': None, 'target': 'ovnmeta-01e9ffa8-19a6-4f5b-88bd-bb167984b351', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:16:07 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:16:07.094 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[b9f64596-9f1e-4545-be1d-ddb7a056747f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe33:31e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 402956, 'tstamp': 402956}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 233700, 'error': None, 'target': 'ovnmeta-01e9ffa8-19a6-4f5b-88bd-bb167984b351', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:16:07 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec  1 05:16:07 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3226630919' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec  1 05:16:07 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec  1 05:16:07 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3226630919' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec  1 05:16:07 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:16:07.113 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[682b2af4-2f82-4bdb-bad7-8ef8ae0ec3da]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap01e9ffa8-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:33:03:1e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 13], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 402956, 'reachable_time': 19773, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 233701, 'error': None, 'target': 'ovnmeta-01e9ffa8-19a6-4f5b-88bd-bb167984b351', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:16:07 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:16:07.149 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[dc81d590-c832-4e70-8658-1760553f6939]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:16:07 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:16:07.212 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[163c6e87-989f-41dc-ad2f-7dc5a19e7da5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:16:07 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:16:07.214 141685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap01e9ffa8-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  1 05:16:07 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:16:07.214 141685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  1 05:16:07 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:16:07.215 141685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap01e9ffa8-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  1 05:16:07 np0005540826 nova_compute[229148]: 2025-12-01 10:16:07.217 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:16:07 np0005540826 kernel: tap01e9ffa8-10: entered promiscuous mode
Dec  1 05:16:07 np0005540826 NetworkManager[48989]: <info>  [1764584167.2179] manager: (tap01e9ffa8-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/26)
Dec  1 05:16:07 np0005540826 nova_compute[229148]: 2025-12-01 10:16:07.219 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:16:07 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:16:07.221 141685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap01e9ffa8-10, col_values=(('external_ids', {'iface-id': '2359edd9-0d76-4c74-9ef2-be72e6137c42'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  1 05:16:07 np0005540826 nova_compute[229148]: 2025-12-01 10:16:07.222 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:16:07 np0005540826 ovn_controller[132309]: 2025-12-01T10:16:07Z|00031|binding|INFO|Releasing lport 2359edd9-0d76-4c74-9ef2-be72e6137c42 from this chassis (sb_readonly=0)
Dec  1 05:16:07 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:07 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb40001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:07 np0005540826 nova_compute[229148]: 2025-12-01 10:16:07.235 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:16:07 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:16:07 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:16:07 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:16:07.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:16:07 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:16:07.236 141685 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/01e9ffa8-19a6-4f5b-88bd-bb167984b351.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/01e9ffa8-19a6-4f5b-88bd-bb167984b351.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec  1 05:16:07 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:16:07.237 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[1bbe5e59-97db-411e-9339-83303034e550]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:16:07 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:16:07.238 141685 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec  1 05:16:07 np0005540826 ovn_metadata_agent[141680]: global
Dec  1 05:16:07 np0005540826 ovn_metadata_agent[141680]:    log         /dev/log local0 debug
Dec  1 05:16:07 np0005540826 ovn_metadata_agent[141680]:    log-tag     haproxy-metadata-proxy-01e9ffa8-19a6-4f5b-88bd-bb167984b351
Dec  1 05:16:07 np0005540826 ovn_metadata_agent[141680]:    user        root
Dec  1 05:16:07 np0005540826 ovn_metadata_agent[141680]:    group       root
Dec  1 05:16:07 np0005540826 ovn_metadata_agent[141680]:    maxconn     1024
Dec  1 05:16:07 np0005540826 ovn_metadata_agent[141680]:    pidfile     /var/lib/neutron/external/pids/01e9ffa8-19a6-4f5b-88bd-bb167984b351.pid.haproxy
Dec  1 05:16:07 np0005540826 ovn_metadata_agent[141680]:    daemon
Dec  1 05:16:07 np0005540826 ovn_metadata_agent[141680]: 
Dec  1 05:16:07 np0005540826 ovn_metadata_agent[141680]: defaults
Dec  1 05:16:07 np0005540826 ovn_metadata_agent[141680]:    log global
Dec  1 05:16:07 np0005540826 ovn_metadata_agent[141680]:    mode http
Dec  1 05:16:07 np0005540826 ovn_metadata_agent[141680]:    option httplog
Dec  1 05:16:07 np0005540826 ovn_metadata_agent[141680]:    option dontlognull
Dec  1 05:16:07 np0005540826 ovn_metadata_agent[141680]:    option http-server-close
Dec  1 05:16:07 np0005540826 ovn_metadata_agent[141680]:    option forwardfor
Dec  1 05:16:07 np0005540826 ovn_metadata_agent[141680]:    retries                 3
Dec  1 05:16:07 np0005540826 ovn_metadata_agent[141680]:    timeout http-request    30s
Dec  1 05:16:07 np0005540826 ovn_metadata_agent[141680]:    timeout connect         30s
Dec  1 05:16:07 np0005540826 ovn_metadata_agent[141680]:    timeout client          32s
Dec  1 05:16:07 np0005540826 ovn_metadata_agent[141680]:    timeout server          32s
Dec  1 05:16:07 np0005540826 ovn_metadata_agent[141680]:    timeout http-keep-alive 30s
Dec  1 05:16:07 np0005540826 ovn_metadata_agent[141680]: 
Dec  1 05:16:07 np0005540826 ovn_metadata_agent[141680]: 
Dec  1 05:16:07 np0005540826 ovn_metadata_agent[141680]: listen listener
Dec  1 05:16:07 np0005540826 ovn_metadata_agent[141680]:    bind 169.254.169.254:80
Dec  1 05:16:07 np0005540826 ovn_metadata_agent[141680]:    server metadata /var/lib/neutron/metadata_proxy
Dec  1 05:16:07 np0005540826 ovn_metadata_agent[141680]:    http-request add-header X-OVN-Network-ID 01e9ffa8-19a6-4f5b-88bd-bb167984b351
Dec  1 05:16:07 np0005540826 ovn_metadata_agent[141680]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec  1 05:16:07 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:16:07.239 141685 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-01e9ffa8-19a6-4f5b-88bd-bb167984b351', 'env', 'PROCESS_TAG=haproxy-01e9ffa8-19a6-4f5b-88bd-bb167984b351', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/01e9ffa8-19a6-4f5b-88bd-bb167984b351.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec  1 05:16:07 np0005540826 nova_compute[229148]: 2025-12-01 10:16:07.491 229152 DEBUG nova.compute.manager [req-bd788faa-d7b8-4832-bcb8-7533dbe51376 req-4880952b-3fa3-412c-8ca3-81c26f0461d5 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 340ca5cd-1261-42f0-9157-5cd4fad54b63] Received event network-vif-plugged-88d90ea0-2620-45cd-9740-631b9db0c778 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  1 05:16:07 np0005540826 nova_compute[229148]: 2025-12-01 10:16:07.491 229152 DEBUG oslo_concurrency.lockutils [req-bd788faa-d7b8-4832-bcb8-7533dbe51376 req-4880952b-3fa3-412c-8ca3-81c26f0461d5 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Acquiring lock "340ca5cd-1261-42f0-9157-5cd4fad54b63-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:16:07 np0005540826 nova_compute[229148]: 2025-12-01 10:16:07.492 229152 DEBUG oslo_concurrency.lockutils [req-bd788faa-d7b8-4832-bcb8-7533dbe51376 req-4880952b-3fa3-412c-8ca3-81c26f0461d5 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Lock "340ca5cd-1261-42f0-9157-5cd4fad54b63-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:16:07 np0005540826 nova_compute[229148]: 2025-12-01 10:16:07.492 229152 DEBUG oslo_concurrency.lockutils [req-bd788faa-d7b8-4832-bcb8-7533dbe51376 req-4880952b-3fa3-412c-8ca3-81c26f0461d5 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Lock "340ca5cd-1261-42f0-9157-5cd4fad54b63-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:16:07 np0005540826 nova_compute[229148]: 2025-12-01 10:16:07.492 229152 DEBUG nova.compute.manager [req-bd788faa-d7b8-4832-bcb8-7533dbe51376 req-4880952b-3fa3-412c-8ca3-81c26f0461d5 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 340ca5cd-1261-42f0-9157-5cd4fad54b63] Processing event network-vif-plugged-88d90ea0-2620-45cd-9740-631b9db0c778 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec  1 05:16:07 np0005540826 nova_compute[229148]: 2025-12-01 10:16:07.493 229152 DEBUG nova.compute.manager [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 340ca5cd-1261-42f0-9157-5cd4fad54b63] Instance event wait completed in 3 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  1 05:16:07 np0005540826 nova_compute[229148]: 2025-12-01 10:16:07.497 229152 DEBUG nova.virt.driver [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] Emitting event <LifecycleEvent: 1764584167.49762, 340ca5cd-1261-42f0-9157-5cd4fad54b63 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  1 05:16:07 np0005540826 nova_compute[229148]: 2025-12-01 10:16:07.498 229152 INFO nova.compute.manager [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] [instance: 340ca5cd-1261-42f0-9157-5cd4fad54b63] VM Resumed (Lifecycle Event)#033[00m
Dec  1 05:16:07 np0005540826 nova_compute[229148]: 2025-12-01 10:16:07.508 229152 DEBUG nova.virt.libvirt.driver [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 340ca5cd-1261-42f0-9157-5cd4fad54b63] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  1 05:16:07 np0005540826 nova_compute[229148]: 2025-12-01 10:16:07.516 229152 INFO nova.virt.libvirt.driver [-] [instance: 340ca5cd-1261-42f0-9157-5cd4fad54b63] Instance spawned successfully.#033[00m
Dec  1 05:16:07 np0005540826 nova_compute[229148]: 2025-12-01 10:16:07.516 229152 DEBUG nova.virt.libvirt.driver [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 340ca5cd-1261-42f0-9157-5cd4fad54b63] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  1 05:16:07 np0005540826 nova_compute[229148]: 2025-12-01 10:16:07.583 229152 DEBUG nova.compute.manager [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] [instance: 340ca5cd-1261-42f0-9157-5cd4fad54b63] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  1 05:16:07 np0005540826 nova_compute[229148]: 2025-12-01 10:16:07.587 229152 DEBUG nova.compute.manager [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] [instance: 340ca5cd-1261-42f0-9157-5cd4fad54b63] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  1 05:16:07 np0005540826 podman[233733]: 2025-12-01 10:16:07.606802335 +0000 UTC m=+0.049490170 container create 0bb203173c1c1f7b49526f01d606f3805527963f31bdae67179df5a11921ea17 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-01e9ffa8-19a6-4f5b-88bd-bb167984b351, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  1 05:16:07 np0005540826 systemd[1]: Started libpod-conmon-0bb203173c1c1f7b49526f01d606f3805527963f31bdae67179df5a11921ea17.scope.
Dec  1 05:16:07 np0005540826 podman[233733]: 2025-12-01 10:16:07.579784656 +0000 UTC m=+0.022472491 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  1 05:16:07 np0005540826 nova_compute[229148]: 2025-12-01 10:16:07.688 229152 INFO nova.compute.manager [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] [instance: 340ca5cd-1261-42f0-9157-5cd4fad54b63] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  1 05:16:07 np0005540826 systemd[1]: Started libcrun container.
Dec  1 05:16:07 np0005540826 nova_compute[229148]: 2025-12-01 10:16:07.700 229152 DEBUG nova.virt.libvirt.driver [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 340ca5cd-1261-42f0-9157-5cd4fad54b63] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  1 05:16:07 np0005540826 nova_compute[229148]: 2025-12-01 10:16:07.700 229152 DEBUG nova.virt.libvirt.driver [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 340ca5cd-1261-42f0-9157-5cd4fad54b63] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  1 05:16:07 np0005540826 nova_compute[229148]: 2025-12-01 10:16:07.701 229152 DEBUG nova.virt.libvirt.driver [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 340ca5cd-1261-42f0-9157-5cd4fad54b63] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  1 05:16:07 np0005540826 nova_compute[229148]: 2025-12-01 10:16:07.701 229152 DEBUG nova.virt.libvirt.driver [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 340ca5cd-1261-42f0-9157-5cd4fad54b63] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  1 05:16:07 np0005540826 nova_compute[229148]: 2025-12-01 10:16:07.701 229152 DEBUG nova.virt.libvirt.driver [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 340ca5cd-1261-42f0-9157-5cd4fad54b63] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  1 05:16:07 np0005540826 nova_compute[229148]: 2025-12-01 10:16:07.702 229152 DEBUG nova.virt.libvirt.driver [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 340ca5cd-1261-42f0-9157-5cd4fad54b63] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  1 05:16:07 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3235d0782b8f01654c2258d5734313ad5e50be4a42eff1a1b19add741bfa63bd/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  1 05:16:07 np0005540826 podman[233733]: 2025-12-01 10:16:07.718770934 +0000 UTC m=+0.161458799 container init 0bb203173c1c1f7b49526f01d606f3805527963f31bdae67179df5a11921ea17 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-01e9ffa8-19a6-4f5b-88bd-bb167984b351, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  1 05:16:07 np0005540826 podman[233733]: 2025-12-01 10:16:07.725889985 +0000 UTC m=+0.168577820 container start 0bb203173c1c1f7b49526f01d606f3805527963f31bdae67179df5a11921ea17 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-01e9ffa8-19a6-4f5b-88bd-bb167984b351, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec  1 05:16:07 np0005540826 neutron-haproxy-ovnmeta-01e9ffa8-19a6-4f5b-88bd-bb167984b351[233748]: [NOTICE]   (233752) : New worker (233754) forked
Dec  1 05:16:07 np0005540826 neutron-haproxy-ovnmeta-01e9ffa8-19a6-4f5b-88bd-bb167984b351[233748]: [NOTICE]   (233752) : Loading success.
Dec  1 05:16:07 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:07 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb28002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:07 np0005540826 nova_compute[229148]: 2025-12-01 10:16:07.832 229152 INFO nova.compute.manager [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 340ca5cd-1261-42f0-9157-5cd4fad54b63] Took 13.52 seconds to spawn the instance on the hypervisor.#033[00m
Dec  1 05:16:07 np0005540826 nova_compute[229148]: 2025-12-01 10:16:07.834 229152 DEBUG nova.compute.manager [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 340ca5cd-1261-42f0-9157-5cd4fad54b63] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  1 05:16:07 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:07 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb30002b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:07 np0005540826 nova_compute[229148]: 2025-12-01 10:16:07.934 229152 INFO nova.compute.manager [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 340ca5cd-1261-42f0-9157-5cd4fad54b63] Took 14.45 seconds to build instance.#033[00m
Dec  1 05:16:07 np0005540826 nova_compute[229148]: 2025-12-01 10:16:07.969 229152 DEBUG oslo_concurrency.lockutils [None req-b53932bc-f669-4461-a61c-8faef33c1460 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "340ca5cd-1261-42f0-9157-5cd4fad54b63" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.552s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:16:08 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:16:08 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:16:08 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:16:08.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:16:09 np0005540826 nova_compute[229148]: 2025-12-01 10:16:09.165 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:16:09 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:09 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb30002b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:09 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:16:09 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:16:09 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:16:09.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:16:09 np0005540826 nova_compute[229148]: 2025-12-01 10:16:09.581 229152 DEBUG nova.compute.manager [req-e2325559-dafa-401a-8c45-aa0800108667 req-dff2a067-493d-49d1-9e35-cfec8af78938 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 340ca5cd-1261-42f0-9157-5cd4fad54b63] Received event network-vif-plugged-88d90ea0-2620-45cd-9740-631b9db0c778 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  1 05:16:09 np0005540826 nova_compute[229148]: 2025-12-01 10:16:09.582 229152 DEBUG oslo_concurrency.lockutils [req-e2325559-dafa-401a-8c45-aa0800108667 req-dff2a067-493d-49d1-9e35-cfec8af78938 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Acquiring lock "340ca5cd-1261-42f0-9157-5cd4fad54b63-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:16:09 np0005540826 nova_compute[229148]: 2025-12-01 10:16:09.582 229152 DEBUG oslo_concurrency.lockutils [req-e2325559-dafa-401a-8c45-aa0800108667 req-dff2a067-493d-49d1-9e35-cfec8af78938 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Lock "340ca5cd-1261-42f0-9157-5cd4fad54b63-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:16:09 np0005540826 nova_compute[229148]: 2025-12-01 10:16:09.582 229152 DEBUG oslo_concurrency.lockutils [req-e2325559-dafa-401a-8c45-aa0800108667 req-dff2a067-493d-49d1-9e35-cfec8af78938 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Lock "340ca5cd-1261-42f0-9157-5cd4fad54b63-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:16:09 np0005540826 nova_compute[229148]: 2025-12-01 10:16:09.583 229152 DEBUG nova.compute.manager [req-e2325559-dafa-401a-8c45-aa0800108667 req-dff2a067-493d-49d1-9e35-cfec8af78938 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 340ca5cd-1261-42f0-9157-5cd4fad54b63] No waiting events found dispatching network-vif-plugged-88d90ea0-2620-45cd-9740-631b9db0c778 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  1 05:16:09 np0005540826 nova_compute[229148]: 2025-12-01 10:16:09.583 229152 WARNING nova.compute.manager [req-e2325559-dafa-401a-8c45-aa0800108667 req-dff2a067-493d-49d1-9e35-cfec8af78938 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 340ca5cd-1261-42f0-9157-5cd4fad54b63] Received unexpected event network-vif-plugged-88d90ea0-2620-45cd-9740-631b9db0c778 for instance with vm_state active and task_state None.#033[00m
Dec  1 05:16:09 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:09 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb40001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:09 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:09 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb24002f00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:09 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:09 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  1 05:16:10 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:16:10 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:16:10 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:16:10.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:16:11 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:11 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb28003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:11 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:16:11 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:16:11 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:16:11.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:16:11 np0005540826 nova_compute[229148]: 2025-12-01 10:16:11.376 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:16:11 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:16:11 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:16:11 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:11 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb30002b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:11 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:11 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb40001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:12 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:16:12 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:16:12 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:16:12.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:16:13 np0005540826 podman[233766]: 2025-12-01 10:16:13.01631184 +0000 UTC m=+0.094607124 container health_status 6b06bbb7dde72e1297fe084ecc5611549b12415c8a2a7d771b9728c6924dbf55 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Dec  1 05:16:13 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:13 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb24003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:13 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:16:13 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:16:13 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:16:13.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:16:13 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:13 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb28003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:13 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:13 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb30004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:14 np0005540826 nova_compute[229148]: 2025-12-01 10:16:14.167 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:16:14 np0005540826 ceph-mon[80026]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #49. Immutable memtables: 0.
Dec  1 05:16:14 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:16:14.375140) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  1 05:16:14 np0005540826 ceph-mon[80026]: rocksdb: [db/flush_job.cc:856] [default] [JOB 27] Flushing memtable with next log file: 49
Dec  1 05:16:14 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584174375258, "job": 27, "event": "flush_started", "num_memtables": 1, "num_entries": 1349, "num_deletes": 251, "total_data_size": 3226284, "memory_usage": 3275840, "flush_reason": "Manual Compaction"}
Dec  1 05:16:14 np0005540826 ceph-mon[80026]: rocksdb: [db/flush_job.cc:885] [default] [JOB 27] Level-0 flush table #50: started
Dec  1 05:16:14 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584174387656, "cf_name": "default", "job": 27, "event": "table_file_creation", "file_number": 50, "file_size": 2079191, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 24614, "largest_seqno": 25958, "table_properties": {"data_size": 2073500, "index_size": 3022, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1605, "raw_key_size": 12889, "raw_average_key_size": 20, "raw_value_size": 2061638, "raw_average_value_size": 3221, "num_data_blocks": 135, "num_entries": 640, "num_filter_entries": 640, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764584076, "oldest_key_time": 1764584076, "file_creation_time": 1764584174, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d6e1f97e-eb58-41c1-b758-cb672eabd75e", "db_session_id": "KSHNHU57VR1LHZ6EZUM4", "orig_file_number": 50, "seqno_to_time_mapping": "N/A"}}
Dec  1 05:16:14 np0005540826 ceph-mon[80026]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 27] Flush lasted 12549 microseconds, and 5787 cpu microseconds.
Dec  1 05:16:14 np0005540826 ceph-mon[80026]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 05:16:14 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:16:14.387720) [db/flush_job.cc:967] [default] [JOB 27] Level-0 flush table #50: 2079191 bytes OK
Dec  1 05:16:14 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:16:14.387742) [db/memtable_list.cc:519] [default] Level-0 commit table #50 started
Dec  1 05:16:14 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:16:14.389613) [db/memtable_list.cc:722] [default] Level-0 commit table #50: memtable #1 done
Dec  1 05:16:14 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:16:14.389633) EVENT_LOG_v1 {"time_micros": 1764584174389628, "job": 27, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  1 05:16:14 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:16:14.389654) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  1 05:16:14 np0005540826 ceph-mon[80026]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 27] Try to delete WAL files size 3219842, prev total WAL file size 3219842, number of live WAL files 2.
Dec  1 05:16:14 np0005540826 ceph-mon[80026]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000046.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:16:14 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:16:14.390526) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031373537' seq:72057594037927935, type:22 .. '7061786F730032303039' seq:0, type:0; will stop at (end)
Dec  1 05:16:14 np0005540826 ceph-mon[80026]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 28] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  1 05:16:14 np0005540826 ceph-mon[80026]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 27 Base level 0, inputs: [50(2030KB)], [48(12MB)]
Dec  1 05:16:14 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584174390562, "job": 28, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [50], "files_L6": [48], "score": -1, "input_data_size": 15444181, "oldest_snapshot_seqno": -1}
Dec  1 05:16:14 np0005540826 ceph-mon[80026]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 28] Generated table #51: 5435 keys, 13261095 bytes, temperature: kUnknown
Dec  1 05:16:14 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584174452325, "cf_name": "default", "job": 28, "event": "table_file_creation", "file_number": 51, "file_size": 13261095, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13224829, "index_size": 21564, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 13637, "raw_key_size": 139487, "raw_average_key_size": 25, "raw_value_size": 13126223, "raw_average_value_size": 2415, "num_data_blocks": 876, "num_entries": 5435, "num_filter_entries": 5435, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582559, "oldest_key_time": 0, "file_creation_time": 1764584174, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d6e1f97e-eb58-41c1-b758-cb672eabd75e", "db_session_id": "KSHNHU57VR1LHZ6EZUM4", "orig_file_number": 51, "seqno_to_time_mapping": "N/A"}}
Dec  1 05:16:14 np0005540826 ceph-mon[80026]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 05:16:14 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:16:14.452605) [db/compaction/compaction_job.cc:1663] [default] [JOB 28] Compacted 1@0 + 1@6 files to L6 => 13261095 bytes
Dec  1 05:16:14 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:16:14.454014) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 249.7 rd, 214.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.0, 12.7 +0.0 blob) out(12.6 +0.0 blob), read-write-amplify(13.8) write-amplify(6.4) OK, records in: 5956, records dropped: 521 output_compression: NoCompression
Dec  1 05:16:14 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:16:14.454035) EVENT_LOG_v1 {"time_micros": 1764584174454024, "job": 28, "event": "compaction_finished", "compaction_time_micros": 61856, "compaction_time_cpu_micros": 27068, "output_level": 6, "num_output_files": 1, "total_output_size": 13261095, "num_input_records": 5956, "num_output_records": 5435, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  1 05:16:14 np0005540826 ceph-mon[80026]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000050.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:16:14 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584174454451, "job": 28, "event": "table_file_deletion", "file_number": 50}
Dec  1 05:16:14 np0005540826 ceph-mon[80026]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000048.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:16:14 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584174457203, "job": 28, "event": "table_file_deletion", "file_number": 48}
Dec  1 05:16:14 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:16:14.390464) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:16:14 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:16:14.457271) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:16:14 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:16:14.457278) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:16:14 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:16:14.457280) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:16:14 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:16:14.457282) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:16:14 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:16:14.457284) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:16:14 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:16:14 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:16:14 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:16:14.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:16:15 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:15 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb40001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:15 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:16:15 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:16:15 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:16:15.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:16:15 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:15 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb24003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:15 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:15 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb28004140 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:16 np0005540826 nova_compute[229148]: 2025-12-01 10:16:16.409 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:16:16 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:16:16 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:16:16 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:16:16 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:16:16.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:16:16 np0005540826 NetworkManager[48989]: <info>  [1764584176.8503] manager: (patch-provnet-da274a4a-a49c-4f01-b728-391696cd2672-to-br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/27)
Dec  1 05:16:16 np0005540826 NetworkManager[48989]: <info>  [1764584176.8508] device (patch-provnet-da274a4a-a49c-4f01-b728-391696cd2672-to-br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  1 05:16:16 np0005540826 NetworkManager[48989]: <info>  [1764584176.8523] manager: (patch-br-int-to-provnet-da274a4a-a49c-4f01-b728-391696cd2672): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/28)
Dec  1 05:16:16 np0005540826 NetworkManager[48989]: <info>  [1764584176.8525] device (patch-br-int-to-provnet-da274a4a-a49c-4f01-b728-391696cd2672)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  1 05:16:16 np0005540826 NetworkManager[48989]: <info>  [1764584176.8533] manager: (patch-provnet-da274a4a-a49c-4f01-b728-391696cd2672-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/29)
Dec  1 05:16:16 np0005540826 NetworkManager[48989]: <info>  [1764584176.8538] manager: (patch-br-int-to-provnet-da274a4a-a49c-4f01-b728-391696cd2672): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/30)
Dec  1 05:16:16 np0005540826 NetworkManager[48989]: <info>  [1764584176.8541] device (patch-provnet-da274a4a-a49c-4f01-b728-391696cd2672-to-br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Dec  1 05:16:16 np0005540826 NetworkManager[48989]: <info>  [1764584176.8543] device (patch-br-int-to-provnet-da274a4a-a49c-4f01-b728-391696cd2672)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Dec  1 05:16:16 np0005540826 nova_compute[229148]: 2025-12-01 10:16:16.854 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:16:16 np0005540826 nova_compute[229148]: 2025-12-01 10:16:16.927 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:16:16 np0005540826 ovn_controller[132309]: 2025-12-01T10:16:16Z|00032|binding|INFO|Releasing lport 2359edd9-0d76-4c74-9ef2-be72e6137c42 from this chassis (sb_readonly=0)
Dec  1 05:16:16 np0005540826 nova_compute[229148]: 2025-12-01 10:16:16.934 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:16:17 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis[85207]: [WARNING] 334/101617 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  1 05:16:17 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:17 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb30004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:17 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:16:17 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:16:17 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:16:17.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:16:17 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:17 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb40001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:17 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:17 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb24003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:18 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:16:18 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:16:18 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:16:18.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:16:19 np0005540826 nova_compute[229148]: 2025-12-01 10:16:19.170 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:16:19 np0005540826 ceph-osd[77525]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0.
Dec  1 05:16:19 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:19 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb28004140 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:19 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:16:19 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:16:19 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:16:19.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:16:19 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:19 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb30004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:19 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:19 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb40003cc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:20 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:16:20 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:16:20 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:16:20.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:16:20 np0005540826 podman[233798]: 2025-12-01 10:16:20.986424635 +0000 UTC m=+0.063313977 container health_status 2f6a34a2eb1139886d5df897b4264e2aa17883bd121a8757ac91426f6d44356f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Dec  1 05:16:21 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:21 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb24003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:21 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:16:21 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:16:21 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:16:21.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:16:21 np0005540826 ovn_controller[132309]: 2025-12-01T10:16:21Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:68:11:60 10.100.0.24
Dec  1 05:16:21 np0005540826 ovn_controller[132309]: 2025-12-01T10:16:21Z|00005|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:68:11:60 10.100.0.24
Dec  1 05:16:21 np0005540826 nova_compute[229148]: 2025-12-01 10:16:21.411 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:16:21 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:16:21 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:21 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb28004140 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:21 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:21 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb30004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:22 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:16:22 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:16:22 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:16:22.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:16:23 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:23 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb40003cc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:23 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:16:23 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:16:23 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:16:23.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:16:23 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:23 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb24003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:23 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:23 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb24003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:24 np0005540826 nova_compute[229148]: 2025-12-01 10:16:24.173 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:16:24 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:16:24 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:16:24 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:16:24.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:16:25 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:25 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb30004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:25 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:16:25 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:16:25 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:16:25.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:16:25 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:25 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb40003cc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:25 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:25 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb1c000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:26 np0005540826 nova_compute[229148]: 2025-12-01 10:16:26.414 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:16:26 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:16:26 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:16:26 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:16:26 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:16:26.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:16:27 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:27 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb28004140 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:27 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:16:27 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:16:27 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:16:27.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:16:27 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:27 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb28004140 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:27 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:27 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb40003cc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:28 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:16:28 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:16:28 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:16:28.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:16:29 np0005540826 nova_compute[229148]: 2025-12-01 10:16:29.036 229152 DEBUG oslo_concurrency.lockutils [None req-98244151-6cbe-47d0-8892-81cd05d611d6 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Acquiring lock "340ca5cd-1261-42f0-9157-5cd4fad54b63" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:16:29 np0005540826 nova_compute[229148]: 2025-12-01 10:16:29.037 229152 DEBUG oslo_concurrency.lockutils [None req-98244151-6cbe-47d0-8892-81cd05d611d6 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "340ca5cd-1261-42f0-9157-5cd4fad54b63" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:16:29 np0005540826 nova_compute[229148]: 2025-12-01 10:16:29.037 229152 DEBUG oslo_concurrency.lockutils [None req-98244151-6cbe-47d0-8892-81cd05d611d6 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Acquiring lock "340ca5cd-1261-42f0-9157-5cd4fad54b63-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:16:29 np0005540826 nova_compute[229148]: 2025-12-01 10:16:29.037 229152 DEBUG oslo_concurrency.lockutils [None req-98244151-6cbe-47d0-8892-81cd05d611d6 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "340ca5cd-1261-42f0-9157-5cd4fad54b63-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:16:29 np0005540826 nova_compute[229148]: 2025-12-01 10:16:29.038 229152 DEBUG oslo_concurrency.lockutils [None req-98244151-6cbe-47d0-8892-81cd05d611d6 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "340ca5cd-1261-42f0-9157-5cd4fad54b63-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:16:29 np0005540826 nova_compute[229148]: 2025-12-01 10:16:29.039 229152 INFO nova.compute.manager [None req-98244151-6cbe-47d0-8892-81cd05d611d6 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 340ca5cd-1261-42f0-9157-5cd4fad54b63] Terminating instance#033[00m
Dec  1 05:16:29 np0005540826 nova_compute[229148]: 2025-12-01 10:16:29.040 229152 DEBUG nova.compute.manager [None req-98244151-6cbe-47d0-8892-81cd05d611d6 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 340ca5cd-1261-42f0-9157-5cd4fad54b63] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  1 05:16:29 np0005540826 kernel: tap88d90ea0-26 (unregistering): left promiscuous mode
Dec  1 05:16:29 np0005540826 NetworkManager[48989]: <info>  [1764584189.1018] device (tap88d90ea0-26): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  1 05:16:29 np0005540826 ovn_controller[132309]: 2025-12-01T10:16:29Z|00033|binding|INFO|Releasing lport 88d90ea0-2620-45cd-9740-631b9db0c778 from this chassis (sb_readonly=0)
Dec  1 05:16:29 np0005540826 ovn_controller[132309]: 2025-12-01T10:16:29Z|00034|binding|INFO|Setting lport 88d90ea0-2620-45cd-9740-631b9db0c778 down in Southbound
Dec  1 05:16:29 np0005540826 ovn_controller[132309]: 2025-12-01T10:16:29Z|00035|binding|INFO|Removing iface tap88d90ea0-26 ovn-installed in OVS
Dec  1 05:16:29 np0005540826 nova_compute[229148]: 2025-12-01 10:16:29.114 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:16:29 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:16:29.124 141685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:68:11:60 10.100.0.24'], port_security=['fa:16:3e:68:11:60 10.100.0.24'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.24/28', 'neutron:device_id': '340ca5cd-1261-42f0-9157-5cd4fad54b63', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-01e9ffa8-19a6-4f5b-88bd-bb167984b351', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9f6be4e572624210b91193c011607c08', 'neutron:revision_number': '4', 'neutron:security_group_ids': '1f410c70-f8b1-4f07-81a9-554fe3430914', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=94b5ae61-015c-4486-839e-601cac6764c5, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe07c6626d0>], logical_port=88d90ea0-2620-45cd-9740-631b9db0c778) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe07c6626d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  1 05:16:29 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:16:29.127 141685 INFO neutron.agent.ovn.metadata.agent [-] Port 88d90ea0-2620-45cd-9740-631b9db0c778 in datapath 01e9ffa8-19a6-4f5b-88bd-bb167984b351 unbound from our chassis#033[00m
Dec  1 05:16:29 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:16:29.128 141685 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 01e9ffa8-19a6-4f5b-88bd-bb167984b351, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  1 05:16:29 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:16:29.130 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[a25fbf3f-4e59-4224-837d-33cd5bcde118]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:16:29 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:16:29.131 141685 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-01e9ffa8-19a6-4f5b-88bd-bb167984b351 namespace which is not needed anymore#033[00m
Dec  1 05:16:29 np0005540826 nova_compute[229148]: 2025-12-01 10:16:29.133 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:16:29 np0005540826 nova_compute[229148]: 2025-12-01 10:16:29.174 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:16:29 np0005540826 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000002.scope: Deactivated successfully.
Dec  1 05:16:29 np0005540826 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000002.scope: Consumed 14.020s CPU time.
Dec  1 05:16:29 np0005540826 systemd-machined[192474]: Machine qemu-1-instance-00000002 terminated.
Dec  1 05:16:29 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:29 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb1c0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:29 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:16:29 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:16:29 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:16:29.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:16:29 np0005540826 neutron-haproxy-ovnmeta-01e9ffa8-19a6-4f5b-88bd-bb167984b351[233748]: [NOTICE]   (233752) : haproxy version is 2.8.14-c23fe91
Dec  1 05:16:29 np0005540826 neutron-haproxy-ovnmeta-01e9ffa8-19a6-4f5b-88bd-bb167984b351[233748]: [NOTICE]   (233752) : path to executable is /usr/sbin/haproxy
Dec  1 05:16:29 np0005540826 neutron-haproxy-ovnmeta-01e9ffa8-19a6-4f5b-88bd-bb167984b351[233748]: [WARNING]  (233752) : Exiting Master process...
Dec  1 05:16:29 np0005540826 neutron-haproxy-ovnmeta-01e9ffa8-19a6-4f5b-88bd-bb167984b351[233748]: [ALERT]    (233752) : Current worker (233754) exited with code 143 (Terminated)
Dec  1 05:16:29 np0005540826 neutron-haproxy-ovnmeta-01e9ffa8-19a6-4f5b-88bd-bb167984b351[233748]: [WARNING]  (233752) : All workers exited. Exiting... (0)
Dec  1 05:16:29 np0005540826 nova_compute[229148]: 2025-12-01 10:16:29.281 229152 INFO nova.virt.libvirt.driver [-] [instance: 340ca5cd-1261-42f0-9157-5cd4fad54b63] Instance destroyed successfully.#033[00m
Dec  1 05:16:29 np0005540826 nova_compute[229148]: 2025-12-01 10:16:29.281 229152 DEBUG nova.objects.instance [None req-98244151-6cbe-47d0-8892-81cd05d611d6 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lazy-loading 'resources' on Instance uuid 340ca5cd-1261-42f0-9157-5cd4fad54b63 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  1 05:16:29 np0005540826 systemd[1]: libpod-0bb203173c1c1f7b49526f01d606f3805527963f31bdae67179df5a11921ea17.scope: Deactivated successfully.
Dec  1 05:16:29 np0005540826 podman[233874]: 2025-12-01 10:16:29.29001229 +0000 UTC m=+0.057383004 container died 0bb203173c1c1f7b49526f01d606f3805527963f31bdae67179df5a11921ea17 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-01e9ffa8-19a6-4f5b-88bd-bb167984b351, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec  1 05:16:29 np0005540826 nova_compute[229148]: 2025-12-01 10:16:29.295 229152 DEBUG nova.virt.libvirt.vif [None req-98244151-6cbe-47d0-8892-81cd05d611d6 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-01T10:15:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-12964730',display_name='tempest-TestNetworkBasicOps-server-12964730',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-12964730',id=2,image_ref='8f75d6de-6ce0-44e1-b417-d0111424475b',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCv5Ip5yIpGGSLrMU8LbJnpxZWACtK+hdg+ITbgbUfUMCkNJQKRZ8cD2eVi2crbelQ+qVMGzuSooy6ybCfyfCMLsGCSIrlE7jMMwO9xDQDOU4r5aBuK0vSncn0meVyEDag==',key_name='tempest-TestNetworkBasicOps-1464321504',keypairs=<?>,launch_index=0,launched_at=2025-12-01T10:16:07Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='9f6be4e572624210b91193c011607c08',ramdisk_id='',reservation_id='r-eioa7ghm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='8f75d6de-6ce0-44e1-b417-d0111424475b',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1248115384',owner_user_name='tempest-TestNetworkBasicOps-1248115384-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-01T10:16:07Z,user_data=None,user_id='5b56a238daf0445798410e51caada0ff',uuid=340ca5cd-1261-42f0-9157-5cd4fad54b63,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "88d90ea0-2620-45cd-9740-631b9db0c778", "address": "fa:16:3e:68:11:60", "network": {"id": "01e9ffa8-19a6-4f5b-88bd-bb167984b351", "bridge": "br-int", "label": "tempest-network-smoke--1887862550", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88d90ea0-26", "ovs_interfaceid": "88d90ea0-2620-45cd-9740-631b9db0c778", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  1 05:16:29 np0005540826 nova_compute[229148]: 2025-12-01 10:16:29.296 229152 DEBUG nova.network.os_vif_util [None req-98244151-6cbe-47d0-8892-81cd05d611d6 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Converting VIF {"id": "88d90ea0-2620-45cd-9740-631b9db0c778", "address": "fa:16:3e:68:11:60", "network": {"id": "01e9ffa8-19a6-4f5b-88bd-bb167984b351", "bridge": "br-int", "label": "tempest-network-smoke--1887862550", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88d90ea0-26", "ovs_interfaceid": "88d90ea0-2620-45cd-9740-631b9db0c778", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  1 05:16:29 np0005540826 nova_compute[229148]: 2025-12-01 10:16:29.297 229152 DEBUG nova.network.os_vif_util [None req-98244151-6cbe-47d0-8892-81cd05d611d6 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:68:11:60,bridge_name='br-int',has_traffic_filtering=True,id=88d90ea0-2620-45cd-9740-631b9db0c778,network=Network(01e9ffa8-19a6-4f5b-88bd-bb167984b351),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap88d90ea0-26') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  1 05:16:29 np0005540826 nova_compute[229148]: 2025-12-01 10:16:29.297 229152 DEBUG os_vif [None req-98244151-6cbe-47d0-8892-81cd05d611d6 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:68:11:60,bridge_name='br-int',has_traffic_filtering=True,id=88d90ea0-2620-45cd-9740-631b9db0c778,network=Network(01e9ffa8-19a6-4f5b-88bd-bb167984b351),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap88d90ea0-26') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  1 05:16:29 np0005540826 nova_compute[229148]: 2025-12-01 10:16:29.300 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:16:29 np0005540826 nova_compute[229148]: 2025-12-01 10:16:29.301 229152 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap88d90ea0-26, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  1 05:16:29 np0005540826 nova_compute[229148]: 2025-12-01 10:16:29.306 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:16:29 np0005540826 nova_compute[229148]: 2025-12-01 10:16:29.309 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  1 05:16:29 np0005540826 nova_compute[229148]: 2025-12-01 10:16:29.312 229152 INFO os_vif [None req-98244151-6cbe-47d0-8892-81cd05d611d6 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:68:11:60,bridge_name='br-int',has_traffic_filtering=True,id=88d90ea0-2620-45cd-9740-631b9db0c778,network=Network(01e9ffa8-19a6-4f5b-88bd-bb167984b351),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap88d90ea0-26')#033[00m
Dec  1 05:16:29 np0005540826 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0bb203173c1c1f7b49526f01d606f3805527963f31bdae67179df5a11921ea17-userdata-shm.mount: Deactivated successfully.
Dec  1 05:16:29 np0005540826 systemd[1]: var-lib-containers-storage-overlay-3235d0782b8f01654c2258d5734313ad5e50be4a42eff1a1b19add741bfa63bd-merged.mount: Deactivated successfully.
Dec  1 05:16:29 np0005540826 podman[233874]: 2025-12-01 10:16:29.344865001 +0000 UTC m=+0.112235715 container cleanup 0bb203173c1c1f7b49526f01d606f3805527963f31bdae67179df5a11921ea17 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-01e9ffa8-19a6-4f5b-88bd-bb167984b351, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  1 05:16:29 np0005540826 systemd[1]: libpod-conmon-0bb203173c1c1f7b49526f01d606f3805527963f31bdae67179df5a11921ea17.scope: Deactivated successfully.
Dec  1 05:16:29 np0005540826 podman[233931]: 2025-12-01 10:16:29.424521394 +0000 UTC m=+0.051152902 container remove 0bb203173c1c1f7b49526f01d606f3805527963f31bdae67179df5a11921ea17 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-01e9ffa8-19a6-4f5b-88bd-bb167984b351, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  1 05:16:29 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:16:29.430 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[d0e426de-2654-4248-ad87-ae97b4fc2c65]: (4, ('Mon Dec  1 10:16:29 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-01e9ffa8-19a6-4f5b-88bd-bb167984b351 (0bb203173c1c1f7b49526f01d606f3805527963f31bdae67179df5a11921ea17)\n0bb203173c1c1f7b49526f01d606f3805527963f31bdae67179df5a11921ea17\nMon Dec  1 10:16:29 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-01e9ffa8-19a6-4f5b-88bd-bb167984b351 (0bb203173c1c1f7b49526f01d606f3805527963f31bdae67179df5a11921ea17)\n0bb203173c1c1f7b49526f01d606f3805527963f31bdae67179df5a11921ea17\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:16:29 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:16:29.433 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[ef6761af-0c02-4d0f-8199-d151b0c337e4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:16:29 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:16:29.434 141685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap01e9ffa8-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  1 05:16:29 np0005540826 nova_compute[229148]: 2025-12-01 10:16:29.436 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:16:29 np0005540826 kernel: tap01e9ffa8-10: left promiscuous mode
Dec  1 05:16:29 np0005540826 nova_compute[229148]: 2025-12-01 10:16:29.449 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:16:29 np0005540826 nova_compute[229148]: 2025-12-01 10:16:29.451 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:16:29 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:16:29.454 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[ba98f681-5404-4d16-ab15-1880d3d6e8a2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:16:29 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:16:29.466 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[fe2c637b-3296-4835-a1c1-0a73b5dd9351]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:16:29 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:16:29.468 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[1e3e09cc-7966-48fa-a99d-1a60304e7d19]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:16:29 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:16:29.487 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[c9fbb293-635a-4cca-a7ad-5aab694ba3d6]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 402946, 'reachable_time': 30262, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233949, 'error': None, 'target': 'ovnmeta-01e9ffa8-19a6-4f5b-88bd-bb167984b351', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:16:29 np0005540826 systemd[1]: run-netns-ovnmeta\x2d01e9ffa8\x2d19a6\x2d4f5b\x2d88bd\x2dbb167984b351.mount: Deactivated successfully.
Dec  1 05:16:29 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:16:29.500 141797 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-01e9ffa8-19a6-4f5b-88bd-bb167984b351 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec  1 05:16:29 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:16:29.501 141797 DEBUG oslo.privsep.daemon [-] privsep: reply[8d5a154e-12f0-40d5-a01f-19929294bacd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:16:29 np0005540826 nova_compute[229148]: 2025-12-01 10:16:29.808 229152 DEBUG nova.compute.manager [req-f24db8f4-9f0c-4c82-ae94-073c13906b14 req-393e0db2-4793-48d5-bf73-db860a4e7ab7 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 340ca5cd-1261-42f0-9157-5cd4fad54b63] Received event network-vif-unplugged-88d90ea0-2620-45cd-9740-631b9db0c778 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  1 05:16:29 np0005540826 nova_compute[229148]: 2025-12-01 10:16:29.810 229152 DEBUG oslo_concurrency.lockutils [req-f24db8f4-9f0c-4c82-ae94-073c13906b14 req-393e0db2-4793-48d5-bf73-db860a4e7ab7 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Acquiring lock "340ca5cd-1261-42f0-9157-5cd4fad54b63-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:16:29 np0005540826 nova_compute[229148]: 2025-12-01 10:16:29.810 229152 DEBUG oslo_concurrency.lockutils [req-f24db8f4-9f0c-4c82-ae94-073c13906b14 req-393e0db2-4793-48d5-bf73-db860a4e7ab7 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Lock "340ca5cd-1261-42f0-9157-5cd4fad54b63-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:16:29 np0005540826 nova_compute[229148]: 2025-12-01 10:16:29.810 229152 DEBUG oslo_concurrency.lockutils [req-f24db8f4-9f0c-4c82-ae94-073c13906b14 req-393e0db2-4793-48d5-bf73-db860a4e7ab7 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Lock "340ca5cd-1261-42f0-9157-5cd4fad54b63-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:16:29 np0005540826 nova_compute[229148]: 2025-12-01 10:16:29.811 229152 DEBUG nova.compute.manager [req-f24db8f4-9f0c-4c82-ae94-073c13906b14 req-393e0db2-4793-48d5-bf73-db860a4e7ab7 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 340ca5cd-1261-42f0-9157-5cd4fad54b63] No waiting events found dispatching network-vif-unplugged-88d90ea0-2620-45cd-9740-631b9db0c778 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  1 05:16:29 np0005540826 nova_compute[229148]: 2025-12-01 10:16:29.811 229152 DEBUG nova.compute.manager [req-f24db8f4-9f0c-4c82-ae94-073c13906b14 req-393e0db2-4793-48d5-bf73-db860a4e7ab7 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 340ca5cd-1261-42f0-9157-5cd4fad54b63] Received event network-vif-unplugged-88d90ea0-2620-45cd-9740-631b9db0c778 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec  1 05:16:29 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:29 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb1c0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:29 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:29 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb1c0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:30 np0005540826 nova_compute[229148]: 2025-12-01 10:16:30.349 229152 INFO nova.virt.libvirt.driver [None req-98244151-6cbe-47d0-8892-81cd05d611d6 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 340ca5cd-1261-42f0-9157-5cd4fad54b63] Deleting instance files /var/lib/nova/instances/340ca5cd-1261-42f0-9157-5cd4fad54b63_del#033[00m
Dec  1 05:16:30 np0005540826 nova_compute[229148]: 2025-12-01 10:16:30.350 229152 INFO nova.virt.libvirt.driver [None req-98244151-6cbe-47d0-8892-81cd05d611d6 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 340ca5cd-1261-42f0-9157-5cd4fad54b63] Deletion of /var/lib/nova/instances/340ca5cd-1261-42f0-9157-5cd4fad54b63_del complete#033[00m
Dec  1 05:16:30 np0005540826 nova_compute[229148]: 2025-12-01 10:16:30.424 229152 DEBUG nova.virt.libvirt.host [None req-98244151-6cbe-47d0-8892-81cd05d611d6 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Checking UEFI support for host arch (x86_64) supports_uefi /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1754#033[00m
Dec  1 05:16:30 np0005540826 nova_compute[229148]: 2025-12-01 10:16:30.424 229152 INFO nova.virt.libvirt.host [None req-98244151-6cbe-47d0-8892-81cd05d611d6 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] UEFI support detected#033[00m
Dec  1 05:16:30 np0005540826 nova_compute[229148]: 2025-12-01 10:16:30.427 229152 INFO nova.compute.manager [None req-98244151-6cbe-47d0-8892-81cd05d611d6 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 340ca5cd-1261-42f0-9157-5cd4fad54b63] Took 1.39 seconds to destroy the instance on the hypervisor.#033[00m
Dec  1 05:16:30 np0005540826 nova_compute[229148]: 2025-12-01 10:16:30.427 229152 DEBUG oslo.service.loopingcall [None req-98244151-6cbe-47d0-8892-81cd05d611d6 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  1 05:16:30 np0005540826 nova_compute[229148]: 2025-12-01 10:16:30.428 229152 DEBUG nova.compute.manager [-] [instance: 340ca5cd-1261-42f0-9157-5cd4fad54b63] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  1 05:16:30 np0005540826 nova_compute[229148]: 2025-12-01 10:16:30.428 229152 DEBUG nova.network.neutron [-] [instance: 340ca5cd-1261-42f0-9157-5cd4fad54b63] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  1 05:16:30 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:16:30 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:16:30 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:16:30.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:16:30 np0005540826 nova_compute[229148]: 2025-12-01 10:16:30.958 229152 DEBUG nova.network.neutron [-] [instance: 340ca5cd-1261-42f0-9157-5cd4fad54b63] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  1 05:16:30 np0005540826 nova_compute[229148]: 2025-12-01 10:16:30.972 229152 INFO nova.compute.manager [-] [instance: 340ca5cd-1261-42f0-9157-5cd4fad54b63] Took 0.54 seconds to deallocate network for instance.#033[00m
Dec  1 05:16:31 np0005540826 nova_compute[229148]: 2025-12-01 10:16:31.020 229152 DEBUG oslo_concurrency.lockutils [None req-98244151-6cbe-47d0-8892-81cd05d611d6 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:16:31 np0005540826 nova_compute[229148]: 2025-12-01 10:16:31.021 229152 DEBUG oslo_concurrency.lockutils [None req-98244151-6cbe-47d0-8892-81cd05d611d6 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:16:31 np0005540826 nova_compute[229148]: 2025-12-01 10:16:31.080 229152 DEBUG oslo_concurrency.processutils [None req-98244151-6cbe-47d0-8892-81cd05d611d6 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:16:31 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:31 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb30004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:31 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:16:31 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:16:31 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:16:31.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:16:31 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:16:31 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/391728244' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:16:31 np0005540826 nova_compute[229148]: 2025-12-01 10:16:31.578 229152 DEBUG oslo_concurrency.processutils [None req-98244151-6cbe-47d0-8892-81cd05d611d6 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.498s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:16:31 np0005540826 nova_compute[229148]: 2025-12-01 10:16:31.584 229152 DEBUG nova.compute.provider_tree [None req-98244151-6cbe-47d0-8892-81cd05d611d6 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Updating inventory in ProviderTree for provider 19014d04-db84-4f3d-831b-084720e9168c with inventory: {'MEMORY_MB': {'total': 7680, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 59, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Dec  1 05:16:31 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:16:31 np0005540826 nova_compute[229148]: 2025-12-01 10:16:31.636 229152 ERROR nova.scheduler.client.report [None req-98244151-6cbe-47d0-8892-81cd05d611d6 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [req-68b5677b-00d3-40af-ad51-f7188847bb86] Failed to update inventory to [{'MEMORY_MB': {'total': 7680, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 59, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}}] for resource provider with UUID 19014d04-db84-4f3d-831b-084720e9168c.  Got 409: {"errors": [{"status": 409, "title": "Conflict", "detail": "There was a conflict when trying to complete your request.\n\n resource provider generation conflict  ", "code": "placement.concurrent_update", "request_id": "req-68b5677b-00d3-40af-ad51-f7188847bb86"}]}#033[00m
Dec  1 05:16:31 np0005540826 nova_compute[229148]: 2025-12-01 10:16:31.652 229152 DEBUG nova.scheduler.client.report [None req-98244151-6cbe-47d0-8892-81cd05d611d6 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Refreshing inventories for resource provider 19014d04-db84-4f3d-831b-084720e9168c _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Dec  1 05:16:31 np0005540826 nova_compute[229148]: 2025-12-01 10:16:31.675 229152 DEBUG nova.scheduler.client.report [None req-98244151-6cbe-47d0-8892-81cd05d611d6 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Updating ProviderTree inventory for provider 19014d04-db84-4f3d-831b-084720e9168c from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Dec  1 05:16:31 np0005540826 nova_compute[229148]: 2025-12-01 10:16:31.676 229152 DEBUG nova.compute.provider_tree [None req-98244151-6cbe-47d0-8892-81cd05d611d6 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Updating inventory in ProviderTree for provider 19014d04-db84-4f3d-831b-084720e9168c with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Dec  1 05:16:31 np0005540826 nova_compute[229148]: 2025-12-01 10:16:31.692 229152 DEBUG nova.scheduler.client.report [None req-98244151-6cbe-47d0-8892-81cd05d611d6 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Refreshing aggregate associations for resource provider 19014d04-db84-4f3d-831b-084720e9168c, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Dec  1 05:16:31 np0005540826 nova_compute[229148]: 2025-12-01 10:16:31.721 229152 DEBUG nova.scheduler.client.report [None req-98244151-6cbe-47d0-8892-81cd05d611d6 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Refreshing trait associations for resource provider 19014d04-db84-4f3d-831b-084720e9168c, traits: COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SHA,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE2,HW_CPU_X86_ABM,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_FDC,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE4A,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE,HW_CPU_X86_AMD_SVM,HW_CPU_X86_MMX,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_AESNI,HW_CPU_X86_AVX2,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSE42,COMPUTE_RESCUE_BFV,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_AVX,HW_CPU_X86_F16C,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_CLMUL,HW_CPU_X86_BMI,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_BMI2,HW_CPU_X86_FMA3,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SVM,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SOCKET_PCI_NUMA_AFFINITY _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Dec  1 05:16:31 np0005540826 nova_compute[229148]: 2025-12-01 10:16:31.764 229152 DEBUG oslo_concurrency.processutils [None req-98244151-6cbe-47d0-8892-81cd05d611d6 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:16:31 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:31 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb4c001340 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:31 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:31 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb40003cc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:31 np0005540826 nova_compute[229148]: 2025-12-01 10:16:31.903 229152 DEBUG nova.compute.manager [req-c2766ec5-88d5-4151-a137-5e6310df8b51 req-673b751d-9e6c-43f8-980f-58b61db45ff1 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 340ca5cd-1261-42f0-9157-5cd4fad54b63] Received event network-vif-plugged-88d90ea0-2620-45cd-9740-631b9db0c778 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  1 05:16:31 np0005540826 nova_compute[229148]: 2025-12-01 10:16:31.904 229152 DEBUG oslo_concurrency.lockutils [req-c2766ec5-88d5-4151-a137-5e6310df8b51 req-673b751d-9e6c-43f8-980f-58b61db45ff1 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Acquiring lock "340ca5cd-1261-42f0-9157-5cd4fad54b63-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:16:31 np0005540826 nova_compute[229148]: 2025-12-01 10:16:31.904 229152 DEBUG oslo_concurrency.lockutils [req-c2766ec5-88d5-4151-a137-5e6310df8b51 req-673b751d-9e6c-43f8-980f-58b61db45ff1 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Lock "340ca5cd-1261-42f0-9157-5cd4fad54b63-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:16:31 np0005540826 nova_compute[229148]: 2025-12-01 10:16:31.904 229152 DEBUG oslo_concurrency.lockutils [req-c2766ec5-88d5-4151-a137-5e6310df8b51 req-673b751d-9e6c-43f8-980f-58b61db45ff1 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Lock "340ca5cd-1261-42f0-9157-5cd4fad54b63-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:16:31 np0005540826 nova_compute[229148]: 2025-12-01 10:16:31.905 229152 DEBUG nova.compute.manager [req-c2766ec5-88d5-4151-a137-5e6310df8b51 req-673b751d-9e6c-43f8-980f-58b61db45ff1 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 340ca5cd-1261-42f0-9157-5cd4fad54b63] No waiting events found dispatching network-vif-plugged-88d90ea0-2620-45cd-9740-631b9db0c778 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  1 05:16:31 np0005540826 nova_compute[229148]: 2025-12-01 10:16:31.905 229152 WARNING nova.compute.manager [req-c2766ec5-88d5-4151-a137-5e6310df8b51 req-673b751d-9e6c-43f8-980f-58b61db45ff1 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 340ca5cd-1261-42f0-9157-5cd4fad54b63] Received unexpected event network-vif-plugged-88d90ea0-2620-45cd-9740-631b9db0c778 for instance with vm_state deleted and task_state None.#033[00m
Dec  1 05:16:31 np0005540826 nova_compute[229148]: 2025-12-01 10:16:31.905 229152 DEBUG nova.compute.manager [req-c2766ec5-88d5-4151-a137-5e6310df8b51 req-673b751d-9e6c-43f8-980f-58b61db45ff1 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 340ca5cd-1261-42f0-9157-5cd4fad54b63] Received event network-vif-deleted-88d90ea0-2620-45cd-9740-631b9db0c778 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  1 05:16:32 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:16:32 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4104340304' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:16:32 np0005540826 nova_compute[229148]: 2025-12-01 10:16:32.221 229152 DEBUG oslo_concurrency.processutils [None req-98244151-6cbe-47d0-8892-81cd05d611d6 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:16:32 np0005540826 nova_compute[229148]: 2025-12-01 10:16:32.227 229152 DEBUG nova.compute.provider_tree [None req-98244151-6cbe-47d0-8892-81cd05d611d6 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Updating inventory in ProviderTree for provider 19014d04-db84-4f3d-831b-084720e9168c with inventory: {'MEMORY_MB': {'total': 7680, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 59, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Dec  1 05:16:32 np0005540826 nova_compute[229148]: 2025-12-01 10:16:32.270 229152 DEBUG nova.scheduler.client.report [None req-98244151-6cbe-47d0-8892-81cd05d611d6 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Updated inventory for provider 19014d04-db84-4f3d-831b-084720e9168c with generation 3 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7680, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 59, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957#033[00m
Dec  1 05:16:32 np0005540826 nova_compute[229148]: 2025-12-01 10:16:32.271 229152 DEBUG nova.compute.provider_tree [None req-98244151-6cbe-47d0-8892-81cd05d611d6 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Updating resource provider 19014d04-db84-4f3d-831b-084720e9168c generation from 3 to 4 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Dec  1 05:16:32 np0005540826 nova_compute[229148]: 2025-12-01 10:16:32.271 229152 DEBUG nova.compute.provider_tree [None req-98244151-6cbe-47d0-8892-81cd05d611d6 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Updating inventory in ProviderTree for provider 19014d04-db84-4f3d-831b-084720e9168c with inventory: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Dec  1 05:16:32 np0005540826 nova_compute[229148]: 2025-12-01 10:16:32.288 229152 DEBUG oslo_concurrency.lockutils [None req-98244151-6cbe-47d0-8892-81cd05d611d6 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.267s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:16:32 np0005540826 nova_compute[229148]: 2025-12-01 10:16:32.320 229152 INFO nova.scheduler.client.report [None req-98244151-6cbe-47d0-8892-81cd05d611d6 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Deleted allocations for instance 340ca5cd-1261-42f0-9157-5cd4fad54b63#033[00m
Dec  1 05:16:32 np0005540826 nova_compute[229148]: 2025-12-01 10:16:32.388 229152 DEBUG oslo_concurrency.lockutils [None req-98244151-6cbe-47d0-8892-81cd05d611d6 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "340ca5cd-1261-42f0-9157-5cd4fad54b63" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.352s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:16:32 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:16:32 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:16:32 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:16:32.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:16:33 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:33 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb1c0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:33 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:16:33 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:16:33 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:16:33.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:16:33 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:33 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb30004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:33 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:33 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb4c002400 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:34 np0005540826 nova_compute[229148]: 2025-12-01 10:16:34.178 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:16:34 np0005540826 nova_compute[229148]: 2025-12-01 10:16:34.305 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:16:34 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:16:34 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:16:34 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:16:34.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:16:35 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:35 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb40003cc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:35 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:16:35 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:16:35 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:16:35.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:16:35 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:35 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb1c0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:35 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:35 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb30004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:35 np0005540826 podman[233999]: 2025-12-01 10:16:35.978107285 +0000 UTC m=+0.060670194 container health_status 9ce59c2758d021c154e97f8a3178b73d8f43045c59b9ba6fd93369a760a49136 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.vendor=CentOS)
Dec  1 05:16:36 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:16:36 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:16:36 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:16:36 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:16:36.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:16:37 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:37 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb4c002400 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:37 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:16:37 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:16:37 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:16:37.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:16:37 np0005540826 nova_compute[229148]: 2025-12-01 10:16:37.677 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:16:37 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:37 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb40003cc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:37 np0005540826 nova_compute[229148]: 2025-12-01 10:16:37.829 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:16:37 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:37 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb1c0032f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:38 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:16:38 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:16:38 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:16:38.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:16:39 np0005540826 nova_compute[229148]: 2025-12-01 10:16:39.179 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:16:39 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:39 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb30004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:39 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:16:39 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:16:39 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:16:39.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:16:39 np0005540826 nova_compute[229148]: 2025-12-01 10:16:39.307 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:16:39 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:39 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb4c0091b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:39 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:39 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb4c0091b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:40 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:16:40 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:16:40 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:16:40.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:16:41 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:41 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb1c0032f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:41 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:16:41 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:16:41 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:16:41.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:16:41 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:16:41 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:41 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb30004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:41 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:41 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb4c0091b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:42 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:16:42 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:16:42 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:16:42.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:16:43 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:43 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb40003cc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:43 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:16:43 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:16:43 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:16:43.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:16:43 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:43 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb40003cc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:43 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:43 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb30004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:43 np0005540826 podman[234049]: 2025-12-01 10:16:43.970146779 +0000 UTC m=+0.108040862 container health_status 6b06bbb7dde72e1297fe084ecc5611549b12415c8a2a7d771b9728c6924dbf55 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec  1 05:16:44 np0005540826 nova_compute[229148]: 2025-12-01 10:16:44.180 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:16:44 np0005540826 nova_compute[229148]: 2025-12-01 10:16:44.279 229152 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764584189.2768433, 340ca5cd-1261-42f0-9157-5cd4fad54b63 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  1 05:16:44 np0005540826 nova_compute[229148]: 2025-12-01 10:16:44.279 229152 INFO nova.compute.manager [-] [instance: 340ca5cd-1261-42f0-9157-5cd4fad54b63] VM Stopped (Lifecycle Event)#033[00m
Dec  1 05:16:44 np0005540826 nova_compute[229148]: 2025-12-01 10:16:44.299 229152 DEBUG nova.compute.manager [None req-fefae021-f513-40cd-aceb-40e1b136c789 - - - - - -] [instance: 340ca5cd-1261-42f0-9157-5cd4fad54b63] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  1 05:16:44 np0005540826 nova_compute[229148]: 2025-12-01 10:16:44.309 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:16:44 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:16:44 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:16:44 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:16:44.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:16:45 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:45 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb4c009ec0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:45 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:16:45 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:16:45 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:16:45.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:16:45 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:45 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb40003cc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:45 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:45 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb1c003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:46 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:16:46 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:16:46 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:16:46 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:16:46.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:16:47 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:16:47.004 141685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '36:10:da', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '4e:5c:35:98:90:37'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  1 05:16:47 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:16:47.005 141685 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  1 05:16:47 np0005540826 nova_compute[229148]: 2025-12-01 10:16:47.005 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:16:47 np0005540826 nova_compute[229148]: 2025-12-01 10:16:47.109 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:16:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:47 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb30004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:47 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:16:47 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:16:47 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:16:47.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:16:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:47 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb30004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:47 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:47 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb30004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:48 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:16:48 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:16:48 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:16:48.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:16:49 np0005540826 nova_compute[229148]: 2025-12-01 10:16:49.110 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:16:49 np0005540826 nova_compute[229148]: 2025-12-01 10:16:49.111 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  1 05:16:49 np0005540826 nova_compute[229148]: 2025-12-01 10:16:49.111 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  1 05:16:49 np0005540826 nova_compute[229148]: 2025-12-01 10:16:49.126 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  1 05:16:49 np0005540826 nova_compute[229148]: 2025-12-01 10:16:49.182 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:16:49 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:49 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb1c003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:49 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:16:49 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:16:49 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:16:49.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:16:49 np0005540826 nova_compute[229148]: 2025-12-01 10:16:49.310 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:16:49 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:49 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb28004140 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:49 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:49 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb30004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:50 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:16:50.007 141685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=b99910e3-15ec-4cc7-b887-f5229f22d165, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  1 05:16:50 np0005540826 nova_compute[229148]: 2025-12-01 10:16:50.109 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:16:50 np0005540826 nova_compute[229148]: 2025-12-01 10:16:50.110 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:16:50 np0005540826 nova_compute[229148]: 2025-12-01 10:16:50.110 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  1 05:16:50 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:16:50 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:16:50 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:16:50.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:16:51 np0005540826 nova_compute[229148]: 2025-12-01 10:16:51.111 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:16:51 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:51 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb40003cc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:51 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:16:51 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:16:51 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:16:51.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:16:51 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:16:51 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:51 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb1c003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:51 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:51 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb28004140 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:52 np0005540826 podman[234081]: 2025-12-01 10:16:52.005565486 +0000 UTC m=+0.081204151 container health_status 2f6a34a2eb1139886d5df897b4264e2aa17883bd121a8757ac91426f6d44356f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec  1 05:16:52 np0005540826 nova_compute[229148]: 2025-12-01 10:16:52.105 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:16:52 np0005540826 nova_compute[229148]: 2025-12-01 10:16:52.109 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:16:52 np0005540826 nova_compute[229148]: 2025-12-01 10:16:52.109 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:16:52 np0005540826 nova_compute[229148]: 2025-12-01 10:16:52.148 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:16:52 np0005540826 nova_compute[229148]: 2025-12-01 10:16:52.148 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:16:52 np0005540826 nova_compute[229148]: 2025-12-01 10:16:52.148 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:16:52 np0005540826 nova_compute[229148]: 2025-12-01 10:16:52.148 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  1 05:16:52 np0005540826 nova_compute[229148]: 2025-12-01 10:16:52.149 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:16:52 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:16:52 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3552851747' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:16:52 np0005540826 nova_compute[229148]: 2025-12-01 10:16:52.608 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.459s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:16:52 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:16:52 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:16:52 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:16:52.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:16:52 np0005540826 nova_compute[229148]: 2025-12-01 10:16:52.772 229152 WARNING nova.virt.libvirt.driver [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  1 05:16:52 np0005540826 nova_compute[229148]: 2025-12-01 10:16:52.774 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4998MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  1 05:16:52 np0005540826 nova_compute[229148]: 2025-12-01 10:16:52.774 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:16:52 np0005540826 nova_compute[229148]: 2025-12-01 10:16:52.774 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:16:52 np0005540826 nova_compute[229148]: 2025-12-01 10:16:52.836 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  1 05:16:52 np0005540826 nova_compute[229148]: 2025-12-01 10:16:52.837 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  1 05:16:52 np0005540826 nova_compute[229148]: 2025-12-01 10:16:52.860 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:16:53 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:53 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb30004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:53 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:16:53 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3807992003' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:16:53 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:16:53 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:16:53 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:16:53.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:16:53 np0005540826 nova_compute[229148]: 2025-12-01 10:16:53.312 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.452s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:16:53 np0005540826 nova_compute[229148]: 2025-12-01 10:16:53.318 229152 DEBUG nova.compute.provider_tree [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Inventory has not changed in ProviderTree for provider: 19014d04-db84-4f3d-831b-084720e9168c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  1 05:16:53 np0005540826 nova_compute[229148]: 2025-12-01 10:16:53.342 229152 DEBUG nova.scheduler.client.report [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Inventory has not changed for provider 19014d04-db84-4f3d-831b-084720e9168c based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  1 05:16:53 np0005540826 nova_compute[229148]: 2025-12-01 10:16:53.365 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  1 05:16:53 np0005540826 nova_compute[229148]: 2025-12-01 10:16:53.365 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.591s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:16:53 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:53 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb40003cc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:53 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:53 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb40003cc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:54 np0005540826 nova_compute[229148]: 2025-12-01 10:16:54.183 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:16:54 np0005540826 nova_compute[229148]: 2025-12-01 10:16:54.313 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:16:54 np0005540826 nova_compute[229148]: 2025-12-01 10:16:54.366 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:16:54 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:16:54 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:16:54 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:16:54.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:16:55 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:55 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb4c009ec0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:55 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:16:55 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:16:55 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:16:55.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:16:55 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:55 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb1c003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:55 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:55 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb30004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:56 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:16:56 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:16:56 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:16:56 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:16:56.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:16:57 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:57 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb40003cc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:57 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:16:57 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:16:57 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:16:57.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:16:57 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:57 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb4c009ec0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:57 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:57 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb24001090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:58 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:16:58 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:16:58 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:16:58.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:16:59 np0005540826 nova_compute[229148]: 2025-12-01 10:16:59.185 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:16:59 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:59 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb30004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:59 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:16:59 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:16:59 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:16:59.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:16:59 np0005540826 nova_compute[229148]: 2025-12-01 10:16:59.315 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:16:59 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:59 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb40003cc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:59 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:16:59 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb4c009ec0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:17:00 np0005540826 nova_compute[229148]: 2025-12-01 10:17:00.606 229152 DEBUG oslo_concurrency.lockutils [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Acquiring lock "601aa42c-17a5-4bf9-afe5-f481c5fc4648" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:17:00 np0005540826 nova_compute[229148]: 2025-12-01 10:17:00.606 229152 DEBUG oslo_concurrency.lockutils [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "601aa42c-17a5-4bf9-afe5-f481c5fc4648" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:17:00 np0005540826 nova_compute[229148]: 2025-12-01 10:17:00.627 229152 DEBUG nova.compute.manager [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 601aa42c-17a5-4bf9-afe5-f481c5fc4648] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  1 05:17:00 np0005540826 nova_compute[229148]: 2025-12-01 10:17:00.700 229152 DEBUG oslo_concurrency.lockutils [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:17:00 np0005540826 nova_compute[229148]: 2025-12-01 10:17:00.700 229152 DEBUG oslo_concurrency.lockutils [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:17:00 np0005540826 nova_compute[229148]: 2025-12-01 10:17:00.706 229152 DEBUG nova.virt.hardware [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  1 05:17:00 np0005540826 nova_compute[229148]: 2025-12-01 10:17:00.707 229152 INFO nova.compute.claims [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 601aa42c-17a5-4bf9-afe5-f481c5fc4648] Claim successful on node compute-1.ctlplane.example.com#033[00m
Dec  1 05:17:00 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:17:00 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:17:00 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:17:00.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:17:00 np0005540826 nova_compute[229148]: 2025-12-01 10:17:00.814 229152 DEBUG oslo_concurrency.processutils [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:17:01 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:17:01 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/545670811' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:17:01 np0005540826 nova_compute[229148]: 2025-12-01 10:17:01.267 229152 DEBUG oslo_concurrency.processutils [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:17:01 np0005540826 nova_compute[229148]: 2025-12-01 10:17:01.274 229152 DEBUG nova.compute.provider_tree [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Inventory has not changed in ProviderTree for provider: 19014d04-db84-4f3d-831b-084720e9168c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  1 05:17:01 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:17:01 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb4c009ec0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:17:01 np0005540826 nova_compute[229148]: 2025-12-01 10:17:01.289 229152 DEBUG nova.scheduler.client.report [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Inventory has not changed for provider 19014d04-db84-4f3d-831b-084720e9168c based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  1 05:17:01 np0005540826 nova_compute[229148]: 2025-12-01 10:17:01.309 229152 DEBUG oslo_concurrency.lockutils [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.608s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:17:01 np0005540826 nova_compute[229148]: 2025-12-01 10:17:01.310 229152 DEBUG nova.compute.manager [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 601aa42c-17a5-4bf9-afe5-f481c5fc4648] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  1 05:17:01 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:17:01 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:17:01 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:17:01.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:17:01 np0005540826 nova_compute[229148]: 2025-12-01 10:17:01.366 229152 DEBUG nova.compute.manager [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 601aa42c-17a5-4bf9-afe5-f481c5fc4648] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  1 05:17:01 np0005540826 nova_compute[229148]: 2025-12-01 10:17:01.366 229152 DEBUG nova.network.neutron [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 601aa42c-17a5-4bf9-afe5-f481c5fc4648] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  1 05:17:01 np0005540826 nova_compute[229148]: 2025-12-01 10:17:01.390 229152 INFO nova.virt.libvirt.driver [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 601aa42c-17a5-4bf9-afe5-f481c5fc4648] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  1 05:17:01 np0005540826 nova_compute[229148]: 2025-12-01 10:17:01.409 229152 DEBUG nova.compute.manager [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 601aa42c-17a5-4bf9-afe5-f481c5fc4648] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  1 05:17:01 np0005540826 nova_compute[229148]: 2025-12-01 10:17:01.526 229152 DEBUG nova.compute.manager [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 601aa42c-17a5-4bf9-afe5-f481c5fc4648] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  1 05:17:01 np0005540826 nova_compute[229148]: 2025-12-01 10:17:01.528 229152 DEBUG nova.virt.libvirt.driver [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 601aa42c-17a5-4bf9-afe5-f481c5fc4648] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  1 05:17:01 np0005540826 nova_compute[229148]: 2025-12-01 10:17:01.528 229152 INFO nova.virt.libvirt.driver [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 601aa42c-17a5-4bf9-afe5-f481c5fc4648] Creating image(s)#033[00m
Dec  1 05:17:01 np0005540826 nova_compute[229148]: 2025-12-01 10:17:01.556 229152 DEBUG nova.storage.rbd_utils [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] rbd image 601aa42c-17a5-4bf9-afe5-f481c5fc4648_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Dec  1 05:17:01 np0005540826 nova_compute[229148]: 2025-12-01 10:17:01.587 229152 DEBUG nova.storage.rbd_utils [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] rbd image 601aa42c-17a5-4bf9-afe5-f481c5fc4648_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Dec  1 05:17:01 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:17:01 np0005540826 nova_compute[229148]: 2025-12-01 10:17:01.616 229152 DEBUG nova.storage.rbd_utils [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] rbd image 601aa42c-17a5-4bf9-afe5-f481c5fc4648_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Dec  1 05:17:01 np0005540826 nova_compute[229148]: 2025-12-01 10:17:01.620 229152 DEBUG oslo_concurrency.processutils [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/caad95fa2cc8ed03bed2e9851744954b07ec7b34 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:17:01 np0005540826 nova_compute[229148]: 2025-12-01 10:17:01.644 229152 DEBUG nova.policy [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5b56a238daf0445798410e51caada0ff', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9f6be4e572624210b91193c011607c08', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  1 05:17:01 np0005540826 nova_compute[229148]: 2025-12-01 10:17:01.678 229152 DEBUG oslo_concurrency.processutils [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/caad95fa2cc8ed03bed2e9851744954b07ec7b34 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:17:01 np0005540826 nova_compute[229148]: 2025-12-01 10:17:01.679 229152 DEBUG oslo_concurrency.lockutils [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Acquiring lock "caad95fa2cc8ed03bed2e9851744954b07ec7b34" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:17:01 np0005540826 nova_compute[229148]: 2025-12-01 10:17:01.679 229152 DEBUG oslo_concurrency.lockutils [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "caad95fa2cc8ed03bed2e9851744954b07ec7b34" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:17:01 np0005540826 nova_compute[229148]: 2025-12-01 10:17:01.680 229152 DEBUG oslo_concurrency.lockutils [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "caad95fa2cc8ed03bed2e9851744954b07ec7b34" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:17:01 np0005540826 nova_compute[229148]: 2025-12-01 10:17:01.708 229152 DEBUG nova.storage.rbd_utils [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] rbd image 601aa42c-17a5-4bf9-afe5-f481c5fc4648_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Dec  1 05:17:01 np0005540826 nova_compute[229148]: 2025-12-01 10:17:01.712 229152 DEBUG oslo_concurrency.processutils [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/caad95fa2cc8ed03bed2e9851744954b07ec7b34 601aa42c-17a5-4bf9-afe5-f481c5fc4648_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:17:01 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:17:01 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb4c009ec0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:17:01 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:17:01 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb30004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:17:02 np0005540826 nova_compute[229148]: 2025-12-01 10:17:01.999 229152 DEBUG oslo_concurrency.processutils [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/caad95fa2cc8ed03bed2e9851744954b07ec7b34 601aa42c-17a5-4bf9-afe5-f481c5fc4648_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.287s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:17:02 np0005540826 nova_compute[229148]: 2025-12-01 10:17:02.065 229152 DEBUG nova.storage.rbd_utils [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] resizing rbd image 601aa42c-17a5-4bf9-afe5-f481c5fc4648_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Dec  1 05:17:02 np0005540826 nova_compute[229148]: 2025-12-01 10:17:02.171 229152 DEBUG nova.objects.instance [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lazy-loading 'migration_context' on Instance uuid 601aa42c-17a5-4bf9-afe5-f481c5fc4648 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  1 05:17:02 np0005540826 nova_compute[229148]: 2025-12-01 10:17:02.192 229152 DEBUG nova.virt.libvirt.driver [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 601aa42c-17a5-4bf9-afe5-f481c5fc4648] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  1 05:17:02 np0005540826 nova_compute[229148]: 2025-12-01 10:17:02.193 229152 DEBUG nova.virt.libvirt.driver [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 601aa42c-17a5-4bf9-afe5-f481c5fc4648] Ensure instance console log exists: /var/lib/nova/instances/601aa42c-17a5-4bf9-afe5-f481c5fc4648/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  1 05:17:02 np0005540826 nova_compute[229148]: 2025-12-01 10:17:02.193 229152 DEBUG oslo_concurrency.lockutils [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:17:02 np0005540826 nova_compute[229148]: 2025-12-01 10:17:02.193 229152 DEBUG oslo_concurrency.lockutils [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:17:02 np0005540826 nova_compute[229148]: 2025-12-01 10:17:02.194 229152 DEBUG oslo_concurrency.lockutils [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:17:02 np0005540826 nova_compute[229148]: 2025-12-01 10:17:02.274 229152 DEBUG nova.network.neutron [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 601aa42c-17a5-4bf9-afe5-f481c5fc4648] Successfully created port: df689760-00c2-4efe-91db-f3bec7cc8992 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec  1 05:17:02 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:17:02 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:17:02 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:17:02.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:17:03 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:17:03 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb18000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:17:03 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:17:03 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:17:03 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:17:03.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:17:03 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:17:03 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb40003cc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:17:03 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:17:03 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb40003cc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:17:04 np0005540826 nova_compute[229148]: 2025-12-01 10:17:04.187 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:17:04 np0005540826 nova_compute[229148]: 2025-12-01 10:17:04.317 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:17:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:17:04.548 141685 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:17:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:17:04.549 141685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:17:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:17:04.549 141685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:17:04 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:17:04 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.002000048s ======
Dec  1 05:17:04 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:17:04.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000048s
Dec  1 05:17:05 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:17:05 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb30004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:17:05 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:17:05 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:17:05 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:17:05.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:17:05 np0005540826 nova_compute[229148]: 2025-12-01 10:17:05.645 229152 DEBUG nova.network.neutron [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 601aa42c-17a5-4bf9-afe5-f481c5fc4648] Successfully updated port: df689760-00c2-4efe-91db-f3bec7cc8992 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  1 05:17:05 np0005540826 nova_compute[229148]: 2025-12-01 10:17:05.663 229152 DEBUG oslo_concurrency.lockutils [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Acquiring lock "refresh_cache-601aa42c-17a5-4bf9-afe5-f481c5fc4648" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  1 05:17:05 np0005540826 nova_compute[229148]: 2025-12-01 10:17:05.663 229152 DEBUG oslo_concurrency.lockutils [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Acquired lock "refresh_cache-601aa42c-17a5-4bf9-afe5-f481c5fc4648" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  1 05:17:05 np0005540826 nova_compute[229148]: 2025-12-01 10:17:05.663 229152 DEBUG nova.network.neutron [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 601aa42c-17a5-4bf9-afe5-f481c5fc4648] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  1 05:17:05 np0005540826 nova_compute[229148]: 2025-12-01 10:17:05.754 229152 DEBUG nova.compute.manager [req-c6676583-d4ce-4b9e-b2c6-09edfac70143 req-c29f7ae8-1c40-45d9-abc2-de6034291136 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 601aa42c-17a5-4bf9-afe5-f481c5fc4648] Received event network-changed-df689760-00c2-4efe-91db-f3bec7cc8992 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  1 05:17:05 np0005540826 nova_compute[229148]: 2025-12-01 10:17:05.755 229152 DEBUG nova.compute.manager [req-c6676583-d4ce-4b9e-b2c6-09edfac70143 req-c29f7ae8-1c40-45d9-abc2-de6034291136 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 601aa42c-17a5-4bf9-afe5-f481c5fc4648] Refreshing instance network info cache due to event network-changed-df689760-00c2-4efe-91db-f3bec7cc8992. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  1 05:17:05 np0005540826 nova_compute[229148]: 2025-12-01 10:17:05.755 229152 DEBUG oslo_concurrency.lockutils [req-c6676583-d4ce-4b9e-b2c6-09edfac70143 req-c29f7ae8-1c40-45d9-abc2-de6034291136 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Acquiring lock "refresh_cache-601aa42c-17a5-4bf9-afe5-f481c5fc4648" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  1 05:17:05 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:17:05 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb30004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:17:05 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:17:05 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb4c009ec0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:17:06 np0005540826 podman[234392]: 2025-12-01 10:17:06.46502661 +0000 UTC m=+0.060922209 container health_status 9ce59c2758d021c154e97f8a3178b73d8f43045c59b9ba6fd93369a760a49136 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2)
Dec  1 05:17:06 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:17:06 np0005540826 nova_compute[229148]: 2025-12-01 10:17:06.625 229152 DEBUG nova.network.neutron [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 601aa42c-17a5-4bf9-afe5-f481c5fc4648] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  1 05:17:06 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:17:06 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:17:06 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:17:06.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:17:07 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:17:07 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb40003cc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:17:07 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:17:07 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:17:07 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:17:07.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:17:07 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:17:07 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb30004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:17:07 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:17:07 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb30004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:17:07 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 05:17:07 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:17:07 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:17:07 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 05:17:08 np0005540826 nova_compute[229148]: 2025-12-01 10:17:08.395 229152 DEBUG nova.network.neutron [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 601aa42c-17a5-4bf9-afe5-f481c5fc4648] Updating instance_info_cache with network_info: [{"id": "df689760-00c2-4efe-91db-f3bec7cc8992", "address": "fa:16:3e:ab:90:a9", "network": {"id": "1347bb43-2e57-4582-adf2-6882693ebbc2", "bridge": "br-int", "label": "tempest-network-smoke--629227178", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf689760-00", "ovs_interfaceid": "df689760-00c2-4efe-91db-f3bec7cc8992", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  1 05:17:08 np0005540826 nova_compute[229148]: 2025-12-01 10:17:08.421 229152 DEBUG oslo_concurrency.lockutils [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Releasing lock "refresh_cache-601aa42c-17a5-4bf9-afe5-f481c5fc4648" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  1 05:17:08 np0005540826 nova_compute[229148]: 2025-12-01 10:17:08.422 229152 DEBUG nova.compute.manager [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 601aa42c-17a5-4bf9-afe5-f481c5fc4648] Instance network_info: |[{"id": "df689760-00c2-4efe-91db-f3bec7cc8992", "address": "fa:16:3e:ab:90:a9", "network": {"id": "1347bb43-2e57-4582-adf2-6882693ebbc2", "bridge": "br-int", "label": "tempest-network-smoke--629227178", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf689760-00", "ovs_interfaceid": "df689760-00c2-4efe-91db-f3bec7cc8992", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  1 05:17:08 np0005540826 nova_compute[229148]: 2025-12-01 10:17:08.422 229152 DEBUG oslo_concurrency.lockutils [req-c6676583-d4ce-4b9e-b2c6-09edfac70143 req-c29f7ae8-1c40-45d9-abc2-de6034291136 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Acquired lock "refresh_cache-601aa42c-17a5-4bf9-afe5-f481c5fc4648" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  1 05:17:08 np0005540826 nova_compute[229148]: 2025-12-01 10:17:08.423 229152 DEBUG nova.network.neutron [req-c6676583-d4ce-4b9e-b2c6-09edfac70143 req-c29f7ae8-1c40-45d9-abc2-de6034291136 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 601aa42c-17a5-4bf9-afe5-f481c5fc4648] Refreshing network info cache for port df689760-00c2-4efe-91db-f3bec7cc8992 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  1 05:17:08 np0005540826 nova_compute[229148]: 2025-12-01 10:17:08.425 229152 DEBUG nova.virt.libvirt.driver [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 601aa42c-17a5-4bf9-afe5-f481c5fc4648] Start _get_guest_xml network_info=[{"id": "df689760-00c2-4efe-91db-f3bec7cc8992", "address": "fa:16:3e:ab:90:a9", "network": {"id": "1347bb43-2e57-4582-adf2-6882693ebbc2", "bridge": "br-int", "label": "tempest-network-smoke--629227178", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf689760-00", "ovs_interfaceid": "df689760-00c2-4efe-91db-f3bec7cc8992", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-01T10:14:19Z,direct_url=<?>,disk_format='qcow2',id=8f75d6de-6ce0-44e1-b417-d0111424475b,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9a5734898a6345909986f17ddf57b27d',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-01T10:14:22Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'encryption_options': None, 'disk_bus': 'virtio', 'encryption_format': None, 'size': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'device_type': 'disk', 'image_id': '8f75d6de-6ce0-44e1-b417-d0111424475b'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  1 05:17:08 np0005540826 nova_compute[229148]: 2025-12-01 10:17:08.430 229152 WARNING nova.virt.libvirt.driver [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  1 05:17:08 np0005540826 nova_compute[229148]: 2025-12-01 10:17:08.435 229152 DEBUG nova.virt.libvirt.host [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  1 05:17:08 np0005540826 nova_compute[229148]: 2025-12-01 10:17:08.436 229152 DEBUG nova.virt.libvirt.host [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  1 05:17:08 np0005540826 nova_compute[229148]: 2025-12-01 10:17:08.444 229152 DEBUG nova.virt.libvirt.host [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  1 05:17:08 np0005540826 nova_compute[229148]: 2025-12-01 10:17:08.445 229152 DEBUG nova.virt.libvirt.host [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  1 05:17:08 np0005540826 nova_compute[229148]: 2025-12-01 10:17:08.445 229152 DEBUG nova.virt.libvirt.driver [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  1 05:17:08 np0005540826 nova_compute[229148]: 2025-12-01 10:17:08.446 229152 DEBUG nova.virt.hardware [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-01T10:14:17Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='2e731827-1896-49cd-b0cc-12903555d217',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-01T10:14:19Z,direct_url=<?>,disk_format='qcow2',id=8f75d6de-6ce0-44e1-b417-d0111424475b,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9a5734898a6345909986f17ddf57b27d',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-01T10:14:22Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  1 05:17:08 np0005540826 nova_compute[229148]: 2025-12-01 10:17:08.446 229152 DEBUG nova.virt.hardware [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  1 05:17:08 np0005540826 nova_compute[229148]: 2025-12-01 10:17:08.446 229152 DEBUG nova.virt.hardware [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  1 05:17:08 np0005540826 nova_compute[229148]: 2025-12-01 10:17:08.447 229152 DEBUG nova.virt.hardware [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  1 05:17:08 np0005540826 nova_compute[229148]: 2025-12-01 10:17:08.447 229152 DEBUG nova.virt.hardware [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  1 05:17:08 np0005540826 nova_compute[229148]: 2025-12-01 10:17:08.447 229152 DEBUG nova.virt.hardware [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  1 05:17:08 np0005540826 nova_compute[229148]: 2025-12-01 10:17:08.447 229152 DEBUG nova.virt.hardware [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  1 05:17:08 np0005540826 nova_compute[229148]: 2025-12-01 10:17:08.447 229152 DEBUG nova.virt.hardware [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  1 05:17:08 np0005540826 nova_compute[229148]: 2025-12-01 10:17:08.448 229152 DEBUG nova.virt.hardware [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  1 05:17:08 np0005540826 nova_compute[229148]: 2025-12-01 10:17:08.448 229152 DEBUG nova.virt.hardware [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  1 05:17:08 np0005540826 nova_compute[229148]: 2025-12-01 10:17:08.448 229152 DEBUG nova.virt.hardware [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  1 05:17:08 np0005540826 nova_compute[229148]: 2025-12-01 10:17:08.451 229152 DEBUG oslo_concurrency.processutils [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:17:08 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:17:08 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:17:08 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:17:08.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:17:08 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec  1 05:17:08 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2144203208' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Dec  1 05:17:08 np0005540826 nova_compute[229148]: 2025-12-01 10:17:08.915 229152 DEBUG oslo_concurrency.processutils [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:17:08 np0005540826 nova_compute[229148]: 2025-12-01 10:17:08.942 229152 DEBUG nova.storage.rbd_utils [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] rbd image 601aa42c-17a5-4bf9-afe5-f481c5fc4648_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Dec  1 05:17:08 np0005540826 nova_compute[229148]: 2025-12-01 10:17:08.948 229152 DEBUG oslo_concurrency.processutils [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:17:09 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis[85207]: [WARNING] 334/101709 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  1 05:17:09 np0005540826 nova_compute[229148]: 2025-12-01 10:17:09.189 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:17:09 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:17:09 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb4c009ec0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:17:09 np0005540826 nova_compute[229148]: 2025-12-01 10:17:09.319 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:17:09 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:17:09 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:17:09 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:17:09.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:17:09 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec  1 05:17:09 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1701952053' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Dec  1 05:17:09 np0005540826 nova_compute[229148]: 2025-12-01 10:17:09.428 229152 DEBUG oslo_concurrency.processutils [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.480s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:17:09 np0005540826 nova_compute[229148]: 2025-12-01 10:17:09.430 229152 DEBUG nova.virt.libvirt.vif [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-01T10:16:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1837747572',display_name='tempest-TestNetworkBasicOps-server-1837747572',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1837747572',id=3,image_ref='8f75d6de-6ce0-44e1-b417-d0111424475b',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFjrgX1ALH5WYnvTxG769+Xdy1tbtrks6o3UiLYtaXfVRyE0n3DLs9uvOak3ATO6GqaSLDf8DFyog39lXa9nSgA4lrJBT9//92DCbgHhZGYm8LZVBAItxiG3grm37RKP6w==',key_name='tempest-TestNetworkBasicOps-271287835',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9f6be4e572624210b91193c011607c08',ramdisk_id='',reservation_id='r-hkny07g0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='8f75d6de-6ce0-44e1-b417-d0111424475b',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1248115384',owner_user_name='tempest-TestNetworkBasicOps-1248115384-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-01T10:17:01Z,user_data=None,user_id='5b56a238daf0445798410e51caada0ff',uuid=601aa42c-17a5-4bf9-afe5-f481c5fc4648,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "df689760-00c2-4efe-91db-f3bec7cc8992", "address": "fa:16:3e:ab:90:a9", "network": {"id": "1347bb43-2e57-4582-adf2-6882693ebbc2", "bridge": "br-int", "label": "tempest-network-smoke--629227178", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf689760-00", "ovs_interfaceid": "df689760-00c2-4efe-91db-f3bec7cc8992", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  1 05:17:09 np0005540826 nova_compute[229148]: 2025-12-01 10:17:09.431 229152 DEBUG nova.network.os_vif_util [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Converting VIF {"id": "df689760-00c2-4efe-91db-f3bec7cc8992", "address": "fa:16:3e:ab:90:a9", "network": {"id": "1347bb43-2e57-4582-adf2-6882693ebbc2", "bridge": "br-int", "label": "tempest-network-smoke--629227178", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf689760-00", "ovs_interfaceid": "df689760-00c2-4efe-91db-f3bec7cc8992", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  1 05:17:09 np0005540826 nova_compute[229148]: 2025-12-01 10:17:09.432 229152 DEBUG nova.network.os_vif_util [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ab:90:a9,bridge_name='br-int',has_traffic_filtering=True,id=df689760-00c2-4efe-91db-f3bec7cc8992,network=Network(1347bb43-2e57-4582-adf2-6882693ebbc2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf689760-00') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  1 05:17:09 np0005540826 nova_compute[229148]: 2025-12-01 10:17:09.433 229152 DEBUG nova.objects.instance [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lazy-loading 'pci_devices' on Instance uuid 601aa42c-17a5-4bf9-afe5-f481c5fc4648 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  1 05:17:09 np0005540826 nova_compute[229148]: 2025-12-01 10:17:09.449 229152 DEBUG nova.virt.libvirt.driver [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 601aa42c-17a5-4bf9-afe5-f481c5fc4648] End _get_guest_xml xml=<domain type="kvm">
Dec  1 05:17:09 np0005540826 nova_compute[229148]:  <uuid>601aa42c-17a5-4bf9-afe5-f481c5fc4648</uuid>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:  <name>instance-00000003</name>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:  <memory>131072</memory>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:  <vcpu>1</vcpu>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:  <metadata>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  1 05:17:09 np0005540826 nova_compute[229148]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:      <nova:name>tempest-TestNetworkBasicOps-server-1837747572</nova:name>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:      <nova:creationTime>2025-12-01 10:17:08</nova:creationTime>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:      <nova:flavor name="m1.nano">
Dec  1 05:17:09 np0005540826 nova_compute[229148]:        <nova:memory>128</nova:memory>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:        <nova:disk>1</nova:disk>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:        <nova:swap>0</nova:swap>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:        <nova:ephemeral>0</nova:ephemeral>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:        <nova:vcpus>1</nova:vcpus>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:      </nova:flavor>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:      <nova:owner>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:        <nova:user uuid="5b56a238daf0445798410e51caada0ff">tempest-TestNetworkBasicOps-1248115384-project-member</nova:user>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:        <nova:project uuid="9f6be4e572624210b91193c011607c08">tempest-TestNetworkBasicOps-1248115384</nova:project>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:      </nova:owner>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:      <nova:root type="image" uuid="8f75d6de-6ce0-44e1-b417-d0111424475b"/>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:      <nova:ports>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:        <nova:port uuid="df689760-00c2-4efe-91db-f3bec7cc8992">
Dec  1 05:17:09 np0005540826 nova_compute[229148]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:        </nova:port>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:      </nova:ports>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:    </nova:instance>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:  </metadata>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:  <sysinfo type="smbios">
Dec  1 05:17:09 np0005540826 nova_compute[229148]:    <system>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:      <entry name="manufacturer">RDO</entry>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:      <entry name="product">OpenStack Compute</entry>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:      <entry name="serial">601aa42c-17a5-4bf9-afe5-f481c5fc4648</entry>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:      <entry name="uuid">601aa42c-17a5-4bf9-afe5-f481c5fc4648</entry>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:      <entry name="family">Virtual Machine</entry>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:    </system>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:  </sysinfo>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:  <os>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:    <boot dev="hd"/>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:    <smbios mode="sysinfo"/>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:  </os>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:  <features>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:    <acpi/>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:    <apic/>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:    <vmcoreinfo/>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:  </features>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:  <clock offset="utc">
Dec  1 05:17:09 np0005540826 nova_compute[229148]:    <timer name="pit" tickpolicy="delay"/>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:    <timer name="hpet" present="no"/>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:  </clock>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:  <cpu mode="host-model" match="exact">
Dec  1 05:17:09 np0005540826 nova_compute[229148]:    <topology sockets="1" cores="1" threads="1"/>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:  </cpu>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:  <devices>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:    <disk type="network" device="disk">
Dec  1 05:17:09 np0005540826 nova_compute[229148]:      <driver type="raw" cache="none"/>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:      <source protocol="rbd" name="vms/601aa42c-17a5-4bf9-afe5-f481c5fc4648_disk">
Dec  1 05:17:09 np0005540826 nova_compute[229148]:        <host name="192.168.122.100" port="6789"/>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:        <host name="192.168.122.102" port="6789"/>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:        <host name="192.168.122.101" port="6789"/>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:      </source>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:      <auth username="openstack">
Dec  1 05:17:09 np0005540826 nova_compute[229148]:        <secret type="ceph" uuid="365f19c2-81e5-5edd-b6b4-280555214d3a"/>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:      </auth>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:      <target dev="vda" bus="virtio"/>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:    </disk>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:    <disk type="network" device="cdrom">
Dec  1 05:17:09 np0005540826 nova_compute[229148]:      <driver type="raw" cache="none"/>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:      <source protocol="rbd" name="vms/601aa42c-17a5-4bf9-afe5-f481c5fc4648_disk.config">
Dec  1 05:17:09 np0005540826 nova_compute[229148]:        <host name="192.168.122.100" port="6789"/>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:        <host name="192.168.122.102" port="6789"/>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:        <host name="192.168.122.101" port="6789"/>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:      </source>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:      <auth username="openstack">
Dec  1 05:17:09 np0005540826 nova_compute[229148]:        <secret type="ceph" uuid="365f19c2-81e5-5edd-b6b4-280555214d3a"/>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:      </auth>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:      <target dev="sda" bus="sata"/>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:    </disk>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:    <interface type="ethernet">
Dec  1 05:17:09 np0005540826 nova_compute[229148]:      <mac address="fa:16:3e:ab:90:a9"/>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:      <model type="virtio"/>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:      <driver name="vhost" rx_queue_size="512"/>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:      <mtu size="1442"/>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:      <target dev="tapdf689760-00"/>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:    </interface>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:    <serial type="pty">
Dec  1 05:17:09 np0005540826 nova_compute[229148]:      <log file="/var/lib/nova/instances/601aa42c-17a5-4bf9-afe5-f481c5fc4648/console.log" append="off"/>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:    </serial>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:    <video>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:      <model type="virtio"/>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:    </video>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:    <input type="tablet" bus="usb"/>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:    <rng model="virtio">
Dec  1 05:17:09 np0005540826 nova_compute[229148]:      <backend model="random">/dev/urandom</backend>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:    </rng>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root"/>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:    <controller type="usb" index="0"/>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:    <memballoon model="virtio">
Dec  1 05:17:09 np0005540826 nova_compute[229148]:      <stats period="10"/>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:    </memballoon>
Dec  1 05:17:09 np0005540826 nova_compute[229148]:  </devices>
Dec  1 05:17:09 np0005540826 nova_compute[229148]: </domain>
Dec  1 05:17:09 np0005540826 nova_compute[229148]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  1 05:17:09 np0005540826 nova_compute[229148]: 2025-12-01 10:17:09.451 229152 DEBUG nova.compute.manager [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 601aa42c-17a5-4bf9-afe5-f481c5fc4648] Preparing to wait for external event network-vif-plugged-df689760-00c2-4efe-91db-f3bec7cc8992 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  1 05:17:09 np0005540826 nova_compute[229148]: 2025-12-01 10:17:09.451 229152 DEBUG oslo_concurrency.lockutils [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Acquiring lock "601aa42c-17a5-4bf9-afe5-f481c5fc4648-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:17:09 np0005540826 nova_compute[229148]: 2025-12-01 10:17:09.452 229152 DEBUG oslo_concurrency.lockutils [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "601aa42c-17a5-4bf9-afe5-f481c5fc4648-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:17:09 np0005540826 nova_compute[229148]: 2025-12-01 10:17:09.452 229152 DEBUG oslo_concurrency.lockutils [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "601aa42c-17a5-4bf9-afe5-f481c5fc4648-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:17:09 np0005540826 nova_compute[229148]: 2025-12-01 10:17:09.453 229152 DEBUG nova.virt.libvirt.vif [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-01T10:16:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1837747572',display_name='tempest-TestNetworkBasicOps-server-1837747572',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1837747572',id=3,image_ref='8f75d6de-6ce0-44e1-b417-d0111424475b',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFjrgX1ALH5WYnvTxG769+Xdy1tbtrks6o3UiLYtaXfVRyE0n3DLs9uvOak3ATO6GqaSLDf8DFyog39lXa9nSgA4lrJBT9//92DCbgHhZGYm8LZVBAItxiG3grm37RKP6w==',key_name='tempest-TestNetworkBasicOps-271287835',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9f6be4e572624210b91193c011607c08',ramdisk_id='',reservation_id='r-hkny07g0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='8f75d6de-6ce0-44e1-b417-d0111424475b',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1248115384',owner_user_name='tempest-TestNetworkBasicOps-1248115384-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-01T10:17:01Z,user_data=None,user_id='5b56a238daf0445798410e51caada0ff',uuid=601aa42c-17a5-4bf9-afe5-f481c5fc4648,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "df689760-00c2-4efe-91db-f3bec7cc8992", "address": "fa:16:3e:ab:90:a9", "network": {"id": "1347bb43-2e57-4582-adf2-6882693ebbc2", "bridge": "br-int", "label": "tempest-network-smoke--629227178", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf689760-00", "ovs_interfaceid": "df689760-00c2-4efe-91db-f3bec7cc8992", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  1 05:17:09 np0005540826 nova_compute[229148]: 2025-12-01 10:17:09.453 229152 DEBUG nova.network.os_vif_util [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Converting VIF {"id": "df689760-00c2-4efe-91db-f3bec7cc8992", "address": "fa:16:3e:ab:90:a9", "network": {"id": "1347bb43-2e57-4582-adf2-6882693ebbc2", "bridge": "br-int", "label": "tempest-network-smoke--629227178", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf689760-00", "ovs_interfaceid": "df689760-00c2-4efe-91db-f3bec7cc8992", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  1 05:17:09 np0005540826 nova_compute[229148]: 2025-12-01 10:17:09.454 229152 DEBUG nova.network.os_vif_util [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ab:90:a9,bridge_name='br-int',has_traffic_filtering=True,id=df689760-00c2-4efe-91db-f3bec7cc8992,network=Network(1347bb43-2e57-4582-adf2-6882693ebbc2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf689760-00') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  1 05:17:09 np0005540826 nova_compute[229148]: 2025-12-01 10:17:09.454 229152 DEBUG os_vif [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ab:90:a9,bridge_name='br-int',has_traffic_filtering=True,id=df689760-00c2-4efe-91db-f3bec7cc8992,network=Network(1347bb43-2e57-4582-adf2-6882693ebbc2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf689760-00') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  1 05:17:09 np0005540826 nova_compute[229148]: 2025-12-01 10:17:09.455 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:17:09 np0005540826 nova_compute[229148]: 2025-12-01 10:17:09.455 229152 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  1 05:17:09 np0005540826 nova_compute[229148]: 2025-12-01 10:17:09.456 229152 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  1 05:17:09 np0005540826 nova_compute[229148]: 2025-12-01 10:17:09.460 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:17:09 np0005540826 nova_compute[229148]: 2025-12-01 10:17:09.460 229152 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdf689760-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  1 05:17:09 np0005540826 nova_compute[229148]: 2025-12-01 10:17:09.461 229152 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapdf689760-00, col_values=(('external_ids', {'iface-id': 'df689760-00c2-4efe-91db-f3bec7cc8992', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ab:90:a9', 'vm-uuid': '601aa42c-17a5-4bf9-afe5-f481c5fc4648'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  1 05:17:09 np0005540826 nova_compute[229148]: 2025-12-01 10:17:09.462 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:17:09 np0005540826 NetworkManager[48989]: <info>  [1764584229.4638] manager: (tapdf689760-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/31)
Dec  1 05:17:09 np0005540826 nova_compute[229148]: 2025-12-01 10:17:09.465 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  1 05:17:09 np0005540826 nova_compute[229148]: 2025-12-01 10:17:09.469 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:17:09 np0005540826 nova_compute[229148]: 2025-12-01 10:17:09.470 229152 INFO os_vif [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ab:90:a9,bridge_name='br-int',has_traffic_filtering=True,id=df689760-00c2-4efe-91db-f3bec7cc8992,network=Network(1347bb43-2e57-4582-adf2-6882693ebbc2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf689760-00')#033[00m
Dec  1 05:17:09 np0005540826 nova_compute[229148]: 2025-12-01 10:17:09.519 229152 DEBUG nova.virt.libvirt.driver [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  1 05:17:09 np0005540826 nova_compute[229148]: 2025-12-01 10:17:09.519 229152 DEBUG nova.virt.libvirt.driver [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  1 05:17:09 np0005540826 nova_compute[229148]: 2025-12-01 10:17:09.520 229152 DEBUG nova.virt.libvirt.driver [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] No VIF found with MAC fa:16:3e:ab:90:a9, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  1 05:17:09 np0005540826 nova_compute[229148]: 2025-12-01 10:17:09.520 229152 INFO nova.virt.libvirt.driver [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 601aa42c-17a5-4bf9-afe5-f481c5fc4648] Using config drive#033[00m
Dec  1 05:17:09 np0005540826 nova_compute[229148]: 2025-12-01 10:17:09.549 229152 DEBUG nova.storage.rbd_utils [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] rbd image 601aa42c-17a5-4bf9-afe5-f481c5fc4648_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Dec  1 05:17:09 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:17:09 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb40003cc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:17:09 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:17:09 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb30004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:17:09 np0005540826 nova_compute[229148]: 2025-12-01 10:17:09.928 229152 INFO nova.virt.libvirt.driver [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 601aa42c-17a5-4bf9-afe5-f481c5fc4648] Creating config drive at /var/lib/nova/instances/601aa42c-17a5-4bf9-afe5-f481c5fc4648/disk.config#033[00m
Dec  1 05:17:09 np0005540826 nova_compute[229148]: 2025-12-01 10:17:09.933 229152 DEBUG oslo_concurrency.processutils [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/601aa42c-17a5-4bf9-afe5-f481c5fc4648/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpv8e31l78 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:17:09 np0005540826 nova_compute[229148]: 2025-12-01 10:17:09.984 229152 DEBUG nova.network.neutron [req-c6676583-d4ce-4b9e-b2c6-09edfac70143 req-c29f7ae8-1c40-45d9-abc2-de6034291136 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 601aa42c-17a5-4bf9-afe5-f481c5fc4648] Updated VIF entry in instance network info cache for port df689760-00c2-4efe-91db-f3bec7cc8992. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  1 05:17:09 np0005540826 nova_compute[229148]: 2025-12-01 10:17:09.985 229152 DEBUG nova.network.neutron [req-c6676583-d4ce-4b9e-b2c6-09edfac70143 req-c29f7ae8-1c40-45d9-abc2-de6034291136 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 601aa42c-17a5-4bf9-afe5-f481c5fc4648] Updating instance_info_cache with network_info: [{"id": "df689760-00c2-4efe-91db-f3bec7cc8992", "address": "fa:16:3e:ab:90:a9", "network": {"id": "1347bb43-2e57-4582-adf2-6882693ebbc2", "bridge": "br-int", "label": "tempest-network-smoke--629227178", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf689760-00", "ovs_interfaceid": "df689760-00c2-4efe-91db-f3bec7cc8992", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  1 05:17:10 np0005540826 nova_compute[229148]: 2025-12-01 10:17:10.000 229152 DEBUG oslo_concurrency.lockutils [req-c6676583-d4ce-4b9e-b2c6-09edfac70143 req-c29f7ae8-1c40-45d9-abc2-de6034291136 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Releasing lock "refresh_cache-601aa42c-17a5-4bf9-afe5-f481c5fc4648" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  1 05:17:10 np0005540826 nova_compute[229148]: 2025-12-01 10:17:10.064 229152 DEBUG oslo_concurrency.processutils [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/601aa42c-17a5-4bf9-afe5-f481c5fc4648/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpv8e31l78" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:17:10 np0005540826 nova_compute[229148]: 2025-12-01 10:17:10.094 229152 DEBUG nova.storage.rbd_utils [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] rbd image 601aa42c-17a5-4bf9-afe5-f481c5fc4648_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Dec  1 05:17:10 np0005540826 nova_compute[229148]: 2025-12-01 10:17:10.099 229152 DEBUG oslo_concurrency.processutils [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/601aa42c-17a5-4bf9-afe5-f481c5fc4648/disk.config 601aa42c-17a5-4bf9-afe5-f481c5fc4648_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:17:10 np0005540826 nova_compute[229148]: 2025-12-01 10:17:10.569 229152 DEBUG oslo_concurrency.processutils [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/601aa42c-17a5-4bf9-afe5-f481c5fc4648/disk.config 601aa42c-17a5-4bf9-afe5-f481c5fc4648_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.470s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:17:10 np0005540826 nova_compute[229148]: 2025-12-01 10:17:10.570 229152 INFO nova.virt.libvirt.driver [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 601aa42c-17a5-4bf9-afe5-f481c5fc4648] Deleting local config drive /var/lib/nova/instances/601aa42c-17a5-4bf9-afe5-f481c5fc4648/disk.config because it was imported into RBD.#033[00m
Dec  1 05:17:10 np0005540826 kernel: tapdf689760-00: entered promiscuous mode
Dec  1 05:17:10 np0005540826 NetworkManager[48989]: <info>  [1764584230.6273] manager: (tapdf689760-00): new Tun device (/org/freedesktop/NetworkManager/Devices/32)
Dec  1 05:17:10 np0005540826 ovn_controller[132309]: 2025-12-01T10:17:10Z|00036|binding|INFO|Claiming lport df689760-00c2-4efe-91db-f3bec7cc8992 for this chassis.
Dec  1 05:17:10 np0005540826 ovn_controller[132309]: 2025-12-01T10:17:10Z|00037|binding|INFO|df689760-00c2-4efe-91db-f3bec7cc8992: Claiming fa:16:3e:ab:90:a9 10.100.0.13
Dec  1 05:17:10 np0005540826 nova_compute[229148]: 2025-12-01 10:17:10.629 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:17:10 np0005540826 nova_compute[229148]: 2025-12-01 10:17:10.633 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:17:10 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:17:10.644 141685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ab:90:a9 10.100.0.13'], port_security=['fa:16:3e:ab:90:a9 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '601aa42c-17a5-4bf9-afe5-f481c5fc4648', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1347bb43-2e57-4582-adf2-6882693ebbc2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9f6be4e572624210b91193c011607c08', 'neutron:revision_number': '2', 'neutron:security_group_ids': '1eb1b69c-9189-462c-af94-79ddda78cdfe', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7fc9735e-bce6-4db5-ad63-650e462a375c, chassis=[<ovs.db.idl.Row object at 0x7fe07c6626d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe07c6626d0>], logical_port=df689760-00c2-4efe-91db-f3bec7cc8992) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  1 05:17:10 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:17:10.645 141685 INFO neutron.agent.ovn.metadata.agent [-] Port df689760-00c2-4efe-91db-f3bec7cc8992 in datapath 1347bb43-2e57-4582-adf2-6882693ebbc2 bound to our chassis#033[00m
Dec  1 05:17:10 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:17:10.646 141685 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1347bb43-2e57-4582-adf2-6882693ebbc2#033[00m
Dec  1 05:17:10 np0005540826 systemd-udevd[234606]: Network interface NamePolicy= disabled on kernel command line.
Dec  1 05:17:10 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:17:10.660 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[a5003086-ffdd-43ba-813f-31994b4767a3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:17:10 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:17:10.662 141685 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap1347bb43-21 in ovnmeta-1347bb43-2e57-4582-adf2-6882693ebbc2 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec  1 05:17:10 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:17:10.663 233565 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap1347bb43-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec  1 05:17:10 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:17:10.663 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[8779f847-87e0-4871-b477-64a40589c6e3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:17:10 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:17:10.665 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[9378147b-5601-4f6a-b376-d58be6fc353e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:17:10 np0005540826 systemd-machined[192474]: New machine qemu-2-instance-00000003.
Dec  1 05:17:10 np0005540826 NetworkManager[48989]: <info>  [1764584230.6790] device (tapdf689760-00): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  1 05:17:10 np0005540826 NetworkManager[48989]: <info>  [1764584230.6803] device (tapdf689760-00): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  1 05:17:10 np0005540826 systemd[1]: Started Virtual Machine qemu-2-instance-00000003.
Dec  1 05:17:10 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:17:10.688 141797 DEBUG oslo.privsep.daemon [-] privsep: reply[0a83f0b7-2271-438e-a785-ea1a187626da]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:17:10 np0005540826 nova_compute[229148]: 2025-12-01 10:17:10.706 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:17:10 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:17:10.709 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[54f461bd-0b90-4b00-b7a3-d5c4907ef36c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:17:10 np0005540826 ovn_controller[132309]: 2025-12-01T10:17:10Z|00038|binding|INFO|Setting lport df689760-00c2-4efe-91db-f3bec7cc8992 ovn-installed in OVS
Dec  1 05:17:10 np0005540826 ovn_controller[132309]: 2025-12-01T10:17:10Z|00039|binding|INFO|Setting lport df689760-00c2-4efe-91db-f3bec7cc8992 up in Southbound
Dec  1 05:17:10 np0005540826 nova_compute[229148]: 2025-12-01 10:17:10.718 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:17:10 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:17:10 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:17:10 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:17:10.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:17:10 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:17:10.745 233643 DEBUG oslo.privsep.daemon [-] privsep: reply[656dcbfe-8c09-4c88-91f9-aeb38e9acc91]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:17:10 np0005540826 NetworkManager[48989]: <info>  [1764584230.7535] manager: (tap1347bb43-20): new Veth device (/org/freedesktop/NetworkManager/Devices/33)
Dec  1 05:17:10 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:17:10.753 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[a784f928-1073-4e9b-96e3-d4f19e469b9b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:17:10 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:17:10.783 233643 DEBUG oslo.privsep.daemon [-] privsep: reply[947ea755-2e62-41b1-af2b-d87a8ef8655f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:17:10 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:17:10.786 233643 DEBUG oslo.privsep.daemon [-] privsep: reply[1def2492-8d02-48ed-bf02-bf27e7c11894]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:17:10 np0005540826 NetworkManager[48989]: <info>  [1764584230.8564] device (tap1347bb43-20): carrier: link connected
Dec  1 05:17:10 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:17:10.861 233643 DEBUG oslo.privsep.daemon [-] privsep: reply[a962ae5c-cfc9-4042-b371-57da1f32d4f5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:17:10 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:17:10.881 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[d0361d73-272b-44b3-bcf0-48370965a82d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1347bb43-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:31:93:b6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 16], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 409337, 'reachable_time': 41171, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234642, 'error': None, 'target': 'ovnmeta-1347bb43-2e57-4582-adf2-6882693ebbc2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:17:10 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:17:10.900 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[0b94cec0-4f67-44e7-9fbf-0e5bb471ed6f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe31:93b6'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 409337, 'tstamp': 409337}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 234643, 'error': None, 'target': 'ovnmeta-1347bb43-2e57-4582-adf2-6882693ebbc2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:17:10 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:17:10.917 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[7bea9144-6e1d-4833-a140-a3cd6abbd30c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1347bb43-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:31:93:b6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 16], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 409337, 'reachable_time': 41171, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 234644, 'error': None, 'target': 'ovnmeta-1347bb43-2e57-4582-adf2-6882693ebbc2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:17:10 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:17:10.962 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[a939be59-8cc4-45d6-b839-8b7a78861c15]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:17:11 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:17:11.026 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[c8cc3ff8-f2c2-4fc2-9ed9-7088740d780f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:17:11 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:17:11.027 141685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1347bb43-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  1 05:17:11 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:17:11.028 141685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  1 05:17:11 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:17:11.028 141685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1347bb43-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  1 05:17:11 np0005540826 nova_compute[229148]: 2025-12-01 10:17:11.030 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:17:11 np0005540826 NetworkManager[48989]: <info>  [1764584231.0330] manager: (tap1347bb43-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/34)
Dec  1 05:17:11 np0005540826 kernel: tap1347bb43-20: entered promiscuous mode
Dec  1 05:17:11 np0005540826 nova_compute[229148]: 2025-12-01 10:17:11.036 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:17:11 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:17:11.036 141685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1347bb43-20, col_values=(('external_ids', {'iface-id': 'd0dfa36f-de16-42c9-859a-979e8652d555'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  1 05:17:11 np0005540826 ovn_controller[132309]: 2025-12-01T10:17:11Z|00040|binding|INFO|Releasing lport d0dfa36f-de16-42c9-859a-979e8652d555 from this chassis (sb_readonly=0)
Dec  1 05:17:11 np0005540826 nova_compute[229148]: 2025-12-01 10:17:11.038 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:17:11 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:17:11.053 141685 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1347bb43-2e57-4582-adf2-6882693ebbc2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1347bb43-2e57-4582-adf2-6882693ebbc2.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec  1 05:17:11 np0005540826 nova_compute[229148]: 2025-12-01 10:17:11.053 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:17:11 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:17:11.054 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[9d1e7a95-8f50-4749-a686-564236850dd3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:17:11 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:17:11.055 141685 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec  1 05:17:11 np0005540826 ovn_metadata_agent[141680]: global
Dec  1 05:17:11 np0005540826 ovn_metadata_agent[141680]:    log         /dev/log local0 debug
Dec  1 05:17:11 np0005540826 ovn_metadata_agent[141680]:    log-tag     haproxy-metadata-proxy-1347bb43-2e57-4582-adf2-6882693ebbc2
Dec  1 05:17:11 np0005540826 ovn_metadata_agent[141680]:    user        root
Dec  1 05:17:11 np0005540826 ovn_metadata_agent[141680]:    group       root
Dec  1 05:17:11 np0005540826 ovn_metadata_agent[141680]:    maxconn     1024
Dec  1 05:17:11 np0005540826 ovn_metadata_agent[141680]:    pidfile     /var/lib/neutron/external/pids/1347bb43-2e57-4582-adf2-6882693ebbc2.pid.haproxy
Dec  1 05:17:11 np0005540826 ovn_metadata_agent[141680]:    daemon
Dec  1 05:17:11 np0005540826 ovn_metadata_agent[141680]: 
Dec  1 05:17:11 np0005540826 ovn_metadata_agent[141680]: defaults
Dec  1 05:17:11 np0005540826 ovn_metadata_agent[141680]:    log global
Dec  1 05:17:11 np0005540826 ovn_metadata_agent[141680]:    mode http
Dec  1 05:17:11 np0005540826 ovn_metadata_agent[141680]:    option httplog
Dec  1 05:17:11 np0005540826 ovn_metadata_agent[141680]:    option dontlognull
Dec  1 05:17:11 np0005540826 ovn_metadata_agent[141680]:    option http-server-close
Dec  1 05:17:11 np0005540826 ovn_metadata_agent[141680]:    option forwardfor
Dec  1 05:17:11 np0005540826 ovn_metadata_agent[141680]:    retries                 3
Dec  1 05:17:11 np0005540826 ovn_metadata_agent[141680]:    timeout http-request    30s
Dec  1 05:17:11 np0005540826 ovn_metadata_agent[141680]:    timeout connect         30s
Dec  1 05:17:11 np0005540826 ovn_metadata_agent[141680]:    timeout client          32s
Dec  1 05:17:11 np0005540826 ovn_metadata_agent[141680]:    timeout server          32s
Dec  1 05:17:11 np0005540826 ovn_metadata_agent[141680]:    timeout http-keep-alive 30s
Dec  1 05:17:11 np0005540826 ovn_metadata_agent[141680]: 
Dec  1 05:17:11 np0005540826 ovn_metadata_agent[141680]: 
Dec  1 05:17:11 np0005540826 ovn_metadata_agent[141680]: listen listener
Dec  1 05:17:11 np0005540826 ovn_metadata_agent[141680]:    bind 169.254.169.254:80
Dec  1 05:17:11 np0005540826 ovn_metadata_agent[141680]:    server metadata /var/lib/neutron/metadata_proxy
Dec  1 05:17:11 np0005540826 ovn_metadata_agent[141680]:    http-request add-header X-OVN-Network-ID 1347bb43-2e57-4582-adf2-6882693ebbc2
Dec  1 05:17:11 np0005540826 ovn_metadata_agent[141680]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec  1 05:17:11 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:17:11.056 141685 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-1347bb43-2e57-4582-adf2-6882693ebbc2', 'env', 'PROCESS_TAG=haproxy-1347bb43-2e57-4582-adf2-6882693ebbc2', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/1347bb43-2e57-4582-adf2-6882693ebbc2.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec  1 05:17:11 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:17:11 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb180016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:17:11 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:17:11 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:17:11 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:17:11.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:17:11 np0005540826 podman[234676]: 2025-12-01 10:17:11.438178268 +0000 UTC m=+0.049011160 container create 2ad0b2127bfca0f769b9a2f4e3baf5fef87a69c0253c96ad78f3aebad3137b16 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1347bb43-2e57-4582-adf2-6882693ebbc2, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Dec  1 05:17:11 np0005540826 systemd[1]: Started libpod-conmon-2ad0b2127bfca0f769b9a2f4e3baf5fef87a69c0253c96ad78f3aebad3137b16.scope.
Dec  1 05:17:11 np0005540826 systemd[1]: Started libcrun container.
Dec  1 05:17:11 np0005540826 podman[234676]: 2025-12-01 10:17:11.412924805 +0000 UTC m=+0.023757717 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  1 05:17:11 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e5bb5d4fe217e6d0e830dd933b1d19fdec28f041e3dc1a4ce8342099af2d1b25/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  1 05:17:11 np0005540826 podman[234676]: 2025-12-01 10:17:11.530209351 +0000 UTC m=+0.141042263 container init 2ad0b2127bfca0f769b9a2f4e3baf5fef87a69c0253c96ad78f3aebad3137b16 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1347bb43-2e57-4582-adf2-6882693ebbc2, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125)
Dec  1 05:17:11 np0005540826 podman[234676]: 2025-12-01 10:17:11.536567546 +0000 UTC m=+0.147400438 container start 2ad0b2127bfca0f769b9a2f4e3baf5fef87a69c0253c96ad78f3aebad3137b16 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1347bb43-2e57-4582-adf2-6882693ebbc2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 05:17:11 np0005540826 neutron-haproxy-ovnmeta-1347bb43-2e57-4582-adf2-6882693ebbc2[234691]: [NOTICE]   (234695) : New worker (234697) forked
Dec  1 05:17:11 np0005540826 neutron-haproxy-ovnmeta-1347bb43-2e57-4582-adf2-6882693ebbc2[234691]: [NOTICE]   (234695) : Loading success.
Dec  1 05:17:11 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:17:11 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:17:11 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb4c009ec0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:17:11 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:17:11 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb40003cc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:17:11 np0005540826 nova_compute[229148]: 2025-12-01 10:17:11.976 229152 DEBUG nova.virt.driver [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] Emitting event <LifecycleEvent: 1764584231.9759457, 601aa42c-17a5-4bf9-afe5-f481c5fc4648 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  1 05:17:11 np0005540826 nova_compute[229148]: 2025-12-01 10:17:11.976 229152 INFO nova.compute.manager [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] [instance: 601aa42c-17a5-4bf9-afe5-f481c5fc4648] VM Started (Lifecycle Event)#033[00m
Dec  1 05:17:11 np0005540826 nova_compute[229148]: 2025-12-01 10:17:11.997 229152 DEBUG nova.compute.manager [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] [instance: 601aa42c-17a5-4bf9-afe5-f481c5fc4648] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  1 05:17:12 np0005540826 nova_compute[229148]: 2025-12-01 10:17:12.001 229152 DEBUG nova.virt.driver [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] Emitting event <LifecycleEvent: 1764584231.978457, 601aa42c-17a5-4bf9-afe5-f481c5fc4648 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  1 05:17:12 np0005540826 nova_compute[229148]: 2025-12-01 10:17:12.002 229152 INFO nova.compute.manager [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] [instance: 601aa42c-17a5-4bf9-afe5-f481c5fc4648] VM Paused (Lifecycle Event)#033[00m
Dec  1 05:17:12 np0005540826 nova_compute[229148]: 2025-12-01 10:17:12.023 229152 DEBUG nova.compute.manager [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] [instance: 601aa42c-17a5-4bf9-afe5-f481c5fc4648] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  1 05:17:12 np0005540826 nova_compute[229148]: 2025-12-01 10:17:12.026 229152 DEBUG nova.compute.manager [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] [instance: 601aa42c-17a5-4bf9-afe5-f481c5fc4648] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  1 05:17:12 np0005540826 nova_compute[229148]: 2025-12-01 10:17:12.050 229152 INFO nova.compute.manager [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] [instance: 601aa42c-17a5-4bf9-afe5-f481c5fc4648] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  1 05:17:12 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:17:12 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:17:12 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:17:12.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:17:12 np0005540826 nova_compute[229148]: 2025-12-01 10:17:12.804 229152 DEBUG nova.compute.manager [req-d8f43708-9478-4042-b347-18c7ddf71b00 req-238ad3ee-d00f-431a-ac88-0066a1114a23 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 601aa42c-17a5-4bf9-afe5-f481c5fc4648] Received event network-vif-plugged-df689760-00c2-4efe-91db-f3bec7cc8992 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  1 05:17:12 np0005540826 nova_compute[229148]: 2025-12-01 10:17:12.805 229152 DEBUG oslo_concurrency.lockutils [req-d8f43708-9478-4042-b347-18c7ddf71b00 req-238ad3ee-d00f-431a-ac88-0066a1114a23 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Acquiring lock "601aa42c-17a5-4bf9-afe5-f481c5fc4648-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:17:12 np0005540826 nova_compute[229148]: 2025-12-01 10:17:12.805 229152 DEBUG oslo_concurrency.lockutils [req-d8f43708-9478-4042-b347-18c7ddf71b00 req-238ad3ee-d00f-431a-ac88-0066a1114a23 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Lock "601aa42c-17a5-4bf9-afe5-f481c5fc4648-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:17:12 np0005540826 nova_compute[229148]: 2025-12-01 10:17:12.806 229152 DEBUG oslo_concurrency.lockutils [req-d8f43708-9478-4042-b347-18c7ddf71b00 req-238ad3ee-d00f-431a-ac88-0066a1114a23 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Lock "601aa42c-17a5-4bf9-afe5-f481c5fc4648-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:17:12 np0005540826 nova_compute[229148]: 2025-12-01 10:17:12.806 229152 DEBUG nova.compute.manager [req-d8f43708-9478-4042-b347-18c7ddf71b00 req-238ad3ee-d00f-431a-ac88-0066a1114a23 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 601aa42c-17a5-4bf9-afe5-f481c5fc4648] Processing event network-vif-plugged-df689760-00c2-4efe-91db-f3bec7cc8992 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec  1 05:17:12 np0005540826 nova_compute[229148]: 2025-12-01 10:17:12.807 229152 DEBUG nova.compute.manager [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 601aa42c-17a5-4bf9-afe5-f481c5fc4648] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  1 05:17:12 np0005540826 nova_compute[229148]: 2025-12-01 10:17:12.810 229152 DEBUG nova.virt.driver [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] Emitting event <LifecycleEvent: 1764584232.8103485, 601aa42c-17a5-4bf9-afe5-f481c5fc4648 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  1 05:17:12 np0005540826 nova_compute[229148]: 2025-12-01 10:17:12.810 229152 INFO nova.compute.manager [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] [instance: 601aa42c-17a5-4bf9-afe5-f481c5fc4648] VM Resumed (Lifecycle Event)#033[00m
Dec  1 05:17:12 np0005540826 nova_compute[229148]: 2025-12-01 10:17:12.812 229152 DEBUG nova.virt.libvirt.driver [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 601aa42c-17a5-4bf9-afe5-f481c5fc4648] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  1 05:17:12 np0005540826 nova_compute[229148]: 2025-12-01 10:17:12.815 229152 INFO nova.virt.libvirt.driver [-] [instance: 601aa42c-17a5-4bf9-afe5-f481c5fc4648] Instance spawned successfully.#033[00m
Dec  1 05:17:12 np0005540826 nova_compute[229148]: 2025-12-01 10:17:12.816 229152 DEBUG nova.virt.libvirt.driver [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 601aa42c-17a5-4bf9-afe5-f481c5fc4648] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  1 05:17:12 np0005540826 nova_compute[229148]: 2025-12-01 10:17:12.834 229152 DEBUG nova.compute.manager [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] [instance: 601aa42c-17a5-4bf9-afe5-f481c5fc4648] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  1 05:17:12 np0005540826 nova_compute[229148]: 2025-12-01 10:17:12.839 229152 DEBUG nova.virt.libvirt.driver [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 601aa42c-17a5-4bf9-afe5-f481c5fc4648] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  1 05:17:12 np0005540826 nova_compute[229148]: 2025-12-01 10:17:12.839 229152 DEBUG nova.virt.libvirt.driver [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 601aa42c-17a5-4bf9-afe5-f481c5fc4648] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  1 05:17:12 np0005540826 nova_compute[229148]: 2025-12-01 10:17:12.840 229152 DEBUG nova.virt.libvirt.driver [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 601aa42c-17a5-4bf9-afe5-f481c5fc4648] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  1 05:17:12 np0005540826 nova_compute[229148]: 2025-12-01 10:17:12.840 229152 DEBUG nova.virt.libvirt.driver [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 601aa42c-17a5-4bf9-afe5-f481c5fc4648] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  1 05:17:12 np0005540826 nova_compute[229148]: 2025-12-01 10:17:12.841 229152 DEBUG nova.virt.libvirt.driver [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 601aa42c-17a5-4bf9-afe5-f481c5fc4648] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  1 05:17:12 np0005540826 nova_compute[229148]: 2025-12-01 10:17:12.841 229152 DEBUG nova.virt.libvirt.driver [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 601aa42c-17a5-4bf9-afe5-f481c5fc4648] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  1 05:17:12 np0005540826 nova_compute[229148]: 2025-12-01 10:17:12.845 229152 DEBUG nova.compute.manager [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] [instance: 601aa42c-17a5-4bf9-afe5-f481c5fc4648] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  1 05:17:12 np0005540826 nova_compute[229148]: 2025-12-01 10:17:12.875 229152 INFO nova.compute.manager [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] [instance: 601aa42c-17a5-4bf9-afe5-f481c5fc4648] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  1 05:17:12 np0005540826 nova_compute[229148]: 2025-12-01 10:17:12.904 229152 INFO nova.compute.manager [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 601aa42c-17a5-4bf9-afe5-f481c5fc4648] Took 11.38 seconds to spawn the instance on the hypervisor.#033[00m
Dec  1 05:17:12 np0005540826 nova_compute[229148]: 2025-12-01 10:17:12.905 229152 DEBUG nova.compute.manager [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 601aa42c-17a5-4bf9-afe5-f481c5fc4648] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  1 05:17:12 np0005540826 nova_compute[229148]: 2025-12-01 10:17:12.965 229152 INFO nova.compute.manager [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 601aa42c-17a5-4bf9-afe5-f481c5fc4648] Took 12.30 seconds to build instance.#033[00m
Dec  1 05:17:12 np0005540826 nova_compute[229148]: 2025-12-01 10:17:12.984 229152 DEBUG oslo_concurrency.lockutils [None req-080f26e5-2868-4043-a6c4-709883c54ad4 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "601aa42c-17a5-4bf9-afe5-f481c5fc4648" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.378s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:17:13 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:17:13 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb30004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:17:13 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:17:13 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:17:13 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:17:13.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:17:13 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:17:13 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb18002050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:17:13 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:17:13 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb4c009ec0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:17:14 np0005540826 nova_compute[229148]: 2025-12-01 10:17:14.236 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:17:14 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:17:14 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:17:14 np0005540826 nova_compute[229148]: 2025-12-01 10:17:14.463 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:17:14 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:17:14 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:17:14 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:17:14.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:17:14 np0005540826 nova_compute[229148]: 2025-12-01 10:17:14.889 229152 DEBUG nova.compute.manager [req-1d35c9a9-c391-468a-b237-481913ee72fb req-d56d6918-4889-4dce-96ee-c8909d3a3c94 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 601aa42c-17a5-4bf9-afe5-f481c5fc4648] Received event network-vif-plugged-df689760-00c2-4efe-91db-f3bec7cc8992 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  1 05:17:14 np0005540826 nova_compute[229148]: 2025-12-01 10:17:14.889 229152 DEBUG oslo_concurrency.lockutils [req-1d35c9a9-c391-468a-b237-481913ee72fb req-d56d6918-4889-4dce-96ee-c8909d3a3c94 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Acquiring lock "601aa42c-17a5-4bf9-afe5-f481c5fc4648-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:17:14 np0005540826 nova_compute[229148]: 2025-12-01 10:17:14.890 229152 DEBUG oslo_concurrency.lockutils [req-1d35c9a9-c391-468a-b237-481913ee72fb req-d56d6918-4889-4dce-96ee-c8909d3a3c94 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Lock "601aa42c-17a5-4bf9-afe5-f481c5fc4648-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:17:14 np0005540826 nova_compute[229148]: 2025-12-01 10:17:14.890 229152 DEBUG oslo_concurrency.lockutils [req-1d35c9a9-c391-468a-b237-481913ee72fb req-d56d6918-4889-4dce-96ee-c8909d3a3c94 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Lock "601aa42c-17a5-4bf9-afe5-f481c5fc4648-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:17:14 np0005540826 nova_compute[229148]: 2025-12-01 10:17:14.890 229152 DEBUG nova.compute.manager [req-1d35c9a9-c391-468a-b237-481913ee72fb req-d56d6918-4889-4dce-96ee-c8909d3a3c94 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 601aa42c-17a5-4bf9-afe5-f481c5fc4648] No waiting events found dispatching network-vif-plugged-df689760-00c2-4efe-91db-f3bec7cc8992 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  1 05:17:14 np0005540826 nova_compute[229148]: 2025-12-01 10:17:14.890 229152 WARNING nova.compute.manager [req-1d35c9a9-c391-468a-b237-481913ee72fb req-d56d6918-4889-4dce-96ee-c8909d3a3c94 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 601aa42c-17a5-4bf9-afe5-f481c5fc4648] Received unexpected event network-vif-plugged-df689760-00c2-4efe-91db-f3bec7cc8992 for instance with vm_state active and task_state None.#033[00m
Dec  1 05:17:15 np0005540826 podman[234776]: 2025-12-01 10:17:15.003906434 +0000 UTC m=+0.087887443 container health_status 6b06bbb7dde72e1297fe084ecc5611549b12415c8a2a7d771b9728c6924dbf55 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3)
Dec  1 05:17:15 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:17:15 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb40003cc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:17:15 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:17:15 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:17:15 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:17:15.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:17:15 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:17:15 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb30004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:17:15 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:17:15 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb18002050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:17:16 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:17:16 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:17:16 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:17:16 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:17:16.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:17:16 np0005540826 ovn_controller[132309]: 2025-12-01T10:17:16Z|00041|binding|INFO|Releasing lport d0dfa36f-de16-42c9-859a-979e8652d555 from this chassis (sb_readonly=0)
Dec  1 05:17:16 np0005540826 nova_compute[229148]: 2025-12-01 10:17:16.914 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:17:16 np0005540826 NetworkManager[48989]: <info>  [1764584236.9181] manager: (patch-provnet-da274a4a-a49c-4f01-b728-391696cd2672-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/35)
Dec  1 05:17:16 np0005540826 NetworkManager[48989]: <info>  [1764584236.9195] manager: (patch-br-int-to-provnet-da274a4a-a49c-4f01-b728-391696cd2672): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/36)
Dec  1 05:17:16 np0005540826 ovn_controller[132309]: 2025-12-01T10:17:16Z|00042|binding|INFO|Releasing lport d0dfa36f-de16-42c9-859a-979e8652d555 from this chassis (sb_readonly=0)
Dec  1 05:17:16 np0005540826 nova_compute[229148]: 2025-12-01 10:17:16.939 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:17:16 np0005540826 nova_compute[229148]: 2025-12-01 10:17:16.946 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:17:17 np0005540826 nova_compute[229148]: 2025-12-01 10:17:17.167 229152 DEBUG nova.compute.manager [req-90fbb61f-bfbd-41cc-ac50-0a188fcf0c69 req-67babda3-d121-4d0b-b7df-2ee746288066 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 601aa42c-17a5-4bf9-afe5-f481c5fc4648] Received event network-changed-df689760-00c2-4efe-91db-f3bec7cc8992 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  1 05:17:17 np0005540826 nova_compute[229148]: 2025-12-01 10:17:17.167 229152 DEBUG nova.compute.manager [req-90fbb61f-bfbd-41cc-ac50-0a188fcf0c69 req-67babda3-d121-4d0b-b7df-2ee746288066 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 601aa42c-17a5-4bf9-afe5-f481c5fc4648] Refreshing instance network info cache due to event network-changed-df689760-00c2-4efe-91db-f3bec7cc8992. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  1 05:17:17 np0005540826 nova_compute[229148]: 2025-12-01 10:17:17.168 229152 DEBUG oslo_concurrency.lockutils [req-90fbb61f-bfbd-41cc-ac50-0a188fcf0c69 req-67babda3-d121-4d0b-b7df-2ee746288066 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Acquiring lock "refresh_cache-601aa42c-17a5-4bf9-afe5-f481c5fc4648" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  1 05:17:17 np0005540826 nova_compute[229148]: 2025-12-01 10:17:17.168 229152 DEBUG oslo_concurrency.lockutils [req-90fbb61f-bfbd-41cc-ac50-0a188fcf0c69 req-67babda3-d121-4d0b-b7df-2ee746288066 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Acquired lock "refresh_cache-601aa42c-17a5-4bf9-afe5-f481c5fc4648" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  1 05:17:17 np0005540826 nova_compute[229148]: 2025-12-01 10:17:17.168 229152 DEBUG nova.network.neutron [req-90fbb61f-bfbd-41cc-ac50-0a188fcf0c69 req-67babda3-d121-4d0b-b7df-2ee746288066 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 601aa42c-17a5-4bf9-afe5-f481c5fc4648] Refreshing network info cache for port df689760-00c2-4efe-91db-f3bec7cc8992 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  1 05:17:17 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:17:17 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb4c009ec0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:17:17 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:17:17 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:17:17 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:17:17.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:17:17 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:17:17 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb40003cc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:17:17 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:17:17 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb30004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:17:18 np0005540826 nova_compute[229148]: 2025-12-01 10:17:18.314 229152 DEBUG nova.network.neutron [req-90fbb61f-bfbd-41cc-ac50-0a188fcf0c69 req-67babda3-d121-4d0b-b7df-2ee746288066 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 601aa42c-17a5-4bf9-afe5-f481c5fc4648] Updated VIF entry in instance network info cache for port df689760-00c2-4efe-91db-f3bec7cc8992. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  1 05:17:18 np0005540826 nova_compute[229148]: 2025-12-01 10:17:18.315 229152 DEBUG nova.network.neutron [req-90fbb61f-bfbd-41cc-ac50-0a188fcf0c69 req-67babda3-d121-4d0b-b7df-2ee746288066 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 601aa42c-17a5-4bf9-afe5-f481c5fc4648] Updating instance_info_cache with network_info: [{"id": "df689760-00c2-4efe-91db-f3bec7cc8992", "address": "fa:16:3e:ab:90:a9", "network": {"id": "1347bb43-2e57-4582-adf2-6882693ebbc2", "bridge": "br-int", "label": "tempest-network-smoke--629227178", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf689760-00", "ovs_interfaceid": "df689760-00c2-4efe-91db-f3bec7cc8992", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  1 05:17:18 np0005540826 nova_compute[229148]: 2025-12-01 10:17:18.331 229152 DEBUG oslo_concurrency.lockutils [req-90fbb61f-bfbd-41cc-ac50-0a188fcf0c69 req-67babda3-d121-4d0b-b7df-2ee746288066 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Releasing lock "refresh_cache-601aa42c-17a5-4bf9-afe5-f481c5fc4648" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  1 05:17:18 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:17:18 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:17:18 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:17:18.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:17:19 np0005540826 nova_compute[229148]: 2025-12-01 10:17:19.237 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:17:19 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:17:19 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb18002f00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:17:19 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:17:19 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:17:19 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:17:19.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:17:19 np0005540826 nova_compute[229148]: 2025-12-01 10:17:19.464 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:17:19 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:17:19 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb4c009ec0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:17:19 np0005540826 kernel: ganesha.nfsd[233060]: segfault at 50 ip 00007fbbfb85c32e sp 00007fbbc5ffa210 error 4 in libntirpc.so.5.8[7fbbfb841000+2c000] likely on CPU 5 (core 0, socket 5)
Dec  1 05:17:19 np0005540826 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Dec  1 05:17:19 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc[232895]: 01/12/2025 10:17:19 : epoch 692d6acc : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbb40003cc0 fd 39 proxy ignored for local
Dec  1 05:17:19 np0005540826 systemd[1]: Started Process Core Dump (PID 234803/UID 0).
Dec  1 05:17:20 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:17:20 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:17:20 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:17:20.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:17:21 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:17:21 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:17:21 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:17:21.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:17:21 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:17:22 np0005540826 systemd-coredump[234804]: Process 232899 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 43:#012#0  0x00007fbbfb85c32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Dec  1 05:17:22 np0005540826 systemd[1]: systemd-coredump@14-234803-0.service: Deactivated successfully.
Dec  1 05:17:22 np0005540826 systemd[1]: systemd-coredump@14-234803-0.service: Consumed 2.311s CPU time.
Dec  1 05:17:22 np0005540826 podman[234812]: 2025-12-01 10:17:22.426317195 +0000 UTC m=+0.026520501 container died 5c2a0c3e84dee36c8ff9eefbeac4395d62f026409e74501ec2627429fc6e7beb (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, io.buildah.version=1.40.1)
Dec  1 05:17:22 np0005540826 systemd[1]: var-lib-containers-storage-overlay-895194d5328c9f84f7196a01204a02006bbf476e63b8591e08eccaf668a19aa5-merged.mount: Deactivated successfully.
Dec  1 05:17:22 np0005540826 podman[234811]: 2025-12-01 10:17:22.464539064 +0000 UTC m=+0.061638291 container health_status 2f6a34a2eb1139886d5df897b4264e2aa17883bd121a8757ac91426f6d44356f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Dec  1 05:17:22 np0005540826 podman[234812]: 2025-12-01 10:17:22.473988486 +0000 UTC m=+0.074191792 container remove 5c2a0c3e84dee36c8ff9eefbeac4395d62f026409e74501ec2627429fc6e7beb (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-0-0-compute-1-osfnzc, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 05:17:22 np0005540826 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.0.0.compute-1.osfnzc.service: Main process exited, code=exited, status=139/n/a
Dec  1 05:17:22 np0005540826 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.0.0.compute-1.osfnzc.service: Failed with result 'exit-code'.
Dec  1 05:17:22 np0005540826 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.0.0.compute-1.osfnzc.service: Consumed 1.569s CPU time.
Dec  1 05:17:22 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:17:22 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:17:22 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:17:22.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:17:23 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:17:23 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:17:23 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:17:23.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:17:24 np0005540826 nova_compute[229148]: 2025-12-01 10:17:24.239 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:17:24 np0005540826 nova_compute[229148]: 2025-12-01 10:17:24.466 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:17:24 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:17:24 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:17:24 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:17:24.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:17:25 np0005540826 ovn_controller[132309]: 2025-12-01T10:17:25Z|00006|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ab:90:a9 10.100.0.13
Dec  1 05:17:25 np0005540826 ovn_controller[132309]: 2025-12-01T10:17:25Z|00007|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ab:90:a9 10.100.0.13
Dec  1 05:17:25 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:17:25 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:17:25 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:17:25.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:17:26 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:17:26 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:17:26 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:17:26 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:17:26.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:17:27 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:17:27 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:17:27 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:17:27.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:17:27 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis[85207]: [WARNING] 334/101727 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  1 05:17:27 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis[85207]: [WARNING] 334/101727 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 1ms. 0 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  1 05:17:27 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis[85207]: [NOTICE] 334/101727 (4) : haproxy version is 2.3.17-d1c9119
Dec  1 05:17:27 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis[85207]: [NOTICE] 334/101727 (4) : path to executable is /usr/local/sbin/haproxy
Dec  1 05:17:27 np0005540826 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis[85207]: [ALERT] 334/101727 (4) : backend 'backend' has no server available!
Dec  1 05:17:28 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:17:28 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:17:28 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:17:28.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:17:29 np0005540826 nova_compute[229148]: 2025-12-01 10:17:29.241 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:17:29 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:17:29 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:17:29 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:17:29.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:17:29 np0005540826 nova_compute[229148]: 2025-12-01 10:17:29.468 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:17:30 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:17:30 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:17:30 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:17:30.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:17:31 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:17:31 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:17:31 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:17:31.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:17:31 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:17:31 np0005540826 nova_compute[229148]: 2025-12-01 10:17:31.617 229152 INFO nova.compute.manager [None req-d615c470-04f8-47bd-b037-c1fb268196d6 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 601aa42c-17a5-4bf9-afe5-f481c5fc4648] Get console output#033[00m
Dec  1 05:17:31 np0005540826 nova_compute[229148]: 2025-12-01 10:17:31.623 229152 INFO oslo.privsep.daemon [None req-d615c470-04f8-47bd-b037-c1fb268196d6 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmpf61mslhv/privsep.sock']#033[00m
Dec  1 05:17:32 np0005540826 nova_compute[229148]: 2025-12-01 10:17:32.370 229152 INFO oslo.privsep.daemon [None req-d615c470-04f8-47bd-b037-c1fb268196d6 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Spawned new privsep daemon via rootwrap#033[00m
Dec  1 05:17:32 np0005540826 nova_compute[229148]: 2025-12-01 10:17:32.206 234904 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Dec  1 05:17:32 np0005540826 nova_compute[229148]: 2025-12-01 10:17:32.210 234904 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Dec  1 05:17:32 np0005540826 nova_compute[229148]: 2025-12-01 10:17:32.213 234904 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none#033[00m
Dec  1 05:17:32 np0005540826 nova_compute[229148]: 2025-12-01 10:17:32.213 234904 INFO oslo.privsep.daemon [-] privsep daemon running as pid 234904#033[00m
Dec  1 05:17:32 np0005540826 nova_compute[229148]: 2025-12-01 10:17:32.478 234904 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Dec  1 05:17:32 np0005540826 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.0.0.compute-1.osfnzc.service: Scheduled restart job, restart counter is at 15.
Dec  1 05:17:32 np0005540826 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.osfnzc for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 05:17:32 np0005540826 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.0.0.compute-1.osfnzc.service: Consumed 1.569s CPU time.
Dec  1 05:17:32 np0005540826 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.0.0.compute-1.osfnzc.service: Start request repeated too quickly.
Dec  1 05:17:32 np0005540826 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.0.0.compute-1.osfnzc.service: Failed with result 'exit-code'.
Dec  1 05:17:32 np0005540826 systemd[1]: Failed to start Ceph nfs.cephfs.0.0.compute-1.osfnzc for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 05:17:32 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:17:32 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:17:32 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:17:32.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:17:33 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:17:33 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:17:33 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:17:33.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:17:34 np0005540826 nova_compute[229148]: 2025-12-01 10:17:34.243 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:17:34 np0005540826 nova_compute[229148]: 2025-12-01 10:17:34.470 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:17:34 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:17:34 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:17:34 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:17:34.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:17:35 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:17:35 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:17:35 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:17:35.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:17:36 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:17:36 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:17:36 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:17:36 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:17:36.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:17:36 np0005540826 podman[234908]: 2025-12-01 10:17:36.988030917 +0000 UTC m=+0.067506392 container health_status 9ce59c2758d021c154e97f8a3178b73d8f43045c59b9ba6fd93369a760a49136 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  1 05:17:37 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:17:37 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:17:37 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:17:37.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:17:37 np0005540826 nova_compute[229148]: 2025-12-01 10:17:37.800 229152 DEBUG oslo_concurrency.lockutils [None req-3b3583ae-940e-4acd-a9cd-8bcc66273c95 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Acquiring lock "interface-601aa42c-17a5-4bf9-afe5-f481c5fc4648-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:17:37 np0005540826 nova_compute[229148]: 2025-12-01 10:17:37.801 229152 DEBUG oslo_concurrency.lockutils [None req-3b3583ae-940e-4acd-a9cd-8bcc66273c95 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "interface-601aa42c-17a5-4bf9-afe5-f481c5fc4648-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:17:37 np0005540826 nova_compute[229148]: 2025-12-01 10:17:37.801 229152 DEBUG nova.objects.instance [None req-3b3583ae-940e-4acd-a9cd-8bcc66273c95 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lazy-loading 'flavor' on Instance uuid 601aa42c-17a5-4bf9-afe5-f481c5fc4648 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  1 05:17:38 np0005540826 nova_compute[229148]: 2025-12-01 10:17:38.295 229152 DEBUG nova.objects.instance [None req-3b3583ae-940e-4acd-a9cd-8bcc66273c95 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lazy-loading 'pci_requests' on Instance uuid 601aa42c-17a5-4bf9-afe5-f481c5fc4648 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  1 05:17:38 np0005540826 nova_compute[229148]: 2025-12-01 10:17:38.309 229152 DEBUG nova.network.neutron [None req-3b3583ae-940e-4acd-a9cd-8bcc66273c95 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 601aa42c-17a5-4bf9-afe5-f481c5fc4648] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  1 05:17:38 np0005540826 nova_compute[229148]: 2025-12-01 10:17:38.672 229152 DEBUG nova.policy [None req-3b3583ae-940e-4acd-a9cd-8bcc66273c95 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5b56a238daf0445798410e51caada0ff', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9f6be4e572624210b91193c011607c08', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  1 05:17:38 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:17:38 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:17:38 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:17:38.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:17:39 np0005540826 nova_compute[229148]: 2025-12-01 10:17:39.128 229152 DEBUG nova.network.neutron [None req-3b3583ae-940e-4acd-a9cd-8bcc66273c95 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 601aa42c-17a5-4bf9-afe5-f481c5fc4648] Successfully created port: 347c1cee-581a-4693-b712-a2da5dba7d23 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec  1 05:17:39 np0005540826 nova_compute[229148]: 2025-12-01 10:17:39.246 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:17:39 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:17:39 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:17:39 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:17:39.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:17:39 np0005540826 nova_compute[229148]: 2025-12-01 10:17:39.472 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:17:39 np0005540826 nova_compute[229148]: 2025-12-01 10:17:39.896 229152 DEBUG nova.network.neutron [None req-3b3583ae-940e-4acd-a9cd-8bcc66273c95 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 601aa42c-17a5-4bf9-afe5-f481c5fc4648] Successfully updated port: 347c1cee-581a-4693-b712-a2da5dba7d23 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  1 05:17:39 np0005540826 nova_compute[229148]: 2025-12-01 10:17:39.915 229152 DEBUG oslo_concurrency.lockutils [None req-3b3583ae-940e-4acd-a9cd-8bcc66273c95 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Acquiring lock "refresh_cache-601aa42c-17a5-4bf9-afe5-f481c5fc4648" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  1 05:17:39 np0005540826 nova_compute[229148]: 2025-12-01 10:17:39.916 229152 DEBUG oslo_concurrency.lockutils [None req-3b3583ae-940e-4acd-a9cd-8bcc66273c95 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Acquired lock "refresh_cache-601aa42c-17a5-4bf9-afe5-f481c5fc4648" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  1 05:17:39 np0005540826 nova_compute[229148]: 2025-12-01 10:17:39.916 229152 DEBUG nova.network.neutron [None req-3b3583ae-940e-4acd-a9cd-8bcc66273c95 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 601aa42c-17a5-4bf9-afe5-f481c5fc4648] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  1 05:17:40 np0005540826 nova_compute[229148]: 2025-12-01 10:17:40.005 229152 DEBUG nova.compute.manager [req-001ca431-b4b4-4d43-8956-efc8cce455e1 req-4b9c1834-fe2b-4ee7-b1d1-b875e896a5d7 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 601aa42c-17a5-4bf9-afe5-f481c5fc4648] Received event network-changed-347c1cee-581a-4693-b712-a2da5dba7d23 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  1 05:17:40 np0005540826 nova_compute[229148]: 2025-12-01 10:17:40.006 229152 DEBUG nova.compute.manager [req-001ca431-b4b4-4d43-8956-efc8cce455e1 req-4b9c1834-fe2b-4ee7-b1d1-b875e896a5d7 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 601aa42c-17a5-4bf9-afe5-f481c5fc4648] Refreshing instance network info cache due to event network-changed-347c1cee-581a-4693-b712-a2da5dba7d23. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  1 05:17:40 np0005540826 nova_compute[229148]: 2025-12-01 10:17:40.006 229152 DEBUG oslo_concurrency.lockutils [req-001ca431-b4b4-4d43-8956-efc8cce455e1 req-4b9c1834-fe2b-4ee7-b1d1-b875e896a5d7 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Acquiring lock "refresh_cache-601aa42c-17a5-4bf9-afe5-f481c5fc4648" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  1 05:17:40 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:17:40 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:17:40 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:17:40.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:17:41 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:17:41 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:17:41 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:17:41.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:17:41 np0005540826 nova_compute[229148]: 2025-12-01 10:17:41.405 229152 DEBUG nova.network.neutron [None req-3b3583ae-940e-4acd-a9cd-8bcc66273c95 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 601aa42c-17a5-4bf9-afe5-f481c5fc4648] Updating instance_info_cache with network_info: [{"id": "df689760-00c2-4efe-91db-f3bec7cc8992", "address": "fa:16:3e:ab:90:a9", "network": {"id": "1347bb43-2e57-4582-adf2-6882693ebbc2", "bridge": "br-int", "label": "tempest-network-smoke--629227178", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf689760-00", "ovs_interfaceid": "df689760-00c2-4efe-91db-f3bec7cc8992", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "347c1cee-581a-4693-b712-a2da5dba7d23", "address": "fa:16:3e:21:af:6f", "network": {"id": "f886aaab-3794-46e5-a7e6-c1b86b6b50cc", "bridge": "br-int", "label": "tempest-network-smoke--586706688", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap347c1cee-58", "ovs_interfaceid": "347c1cee-581a-4693-b712-a2da5dba7d23", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  1 05:17:41 np0005540826 nova_compute[229148]: 2025-12-01 10:17:41.428 229152 DEBUG oslo_concurrency.lockutils [None req-3b3583ae-940e-4acd-a9cd-8bcc66273c95 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Releasing lock "refresh_cache-601aa42c-17a5-4bf9-afe5-f481c5fc4648" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  1 05:17:41 np0005540826 nova_compute[229148]: 2025-12-01 10:17:41.429 229152 DEBUG oslo_concurrency.lockutils [req-001ca431-b4b4-4d43-8956-efc8cce455e1 req-4b9c1834-fe2b-4ee7-b1d1-b875e896a5d7 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Acquired lock "refresh_cache-601aa42c-17a5-4bf9-afe5-f481c5fc4648" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  1 05:17:41 np0005540826 nova_compute[229148]: 2025-12-01 10:17:41.429 229152 DEBUG nova.network.neutron [req-001ca431-b4b4-4d43-8956-efc8cce455e1 req-4b9c1834-fe2b-4ee7-b1d1-b875e896a5d7 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 601aa42c-17a5-4bf9-afe5-f481c5fc4648] Refreshing network info cache for port 347c1cee-581a-4693-b712-a2da5dba7d23 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  1 05:17:41 np0005540826 nova_compute[229148]: 2025-12-01 10:17:41.432 229152 DEBUG nova.virt.libvirt.vif [None req-3b3583ae-940e-4acd-a9cd-8bcc66273c95 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-01T10:16:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1837747572',display_name='tempest-TestNetworkBasicOps-server-1837747572',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1837747572',id=3,image_ref='8f75d6de-6ce0-44e1-b417-d0111424475b',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFjrgX1ALH5WYnvTxG769+Xdy1tbtrks6o3UiLYtaXfVRyE0n3DLs9uvOak3ATO6GqaSLDf8DFyog39lXa9nSgA4lrJBT9//92DCbgHhZGYm8LZVBAItxiG3grm37RKP6w==',key_name='tempest-TestNetworkBasicOps-271287835',keypairs=<?>,launch_index=0,launched_at=2025-12-01T10:17:12Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='9f6be4e572624210b91193c011607c08',ramdisk_id='',reservation_id='r-hkny07g0',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='8f75d6de-6ce0-44e1-b417-d0111424475b',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1248115384',owner_user_name='tempest-TestNetworkBasicOps-1248115384-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-12-01T10:17:12Z,user_data=None,user_id='5b56a238daf0445798410e51caada0ff',uuid=601aa42c-17a5-4bf9-afe5-f481c5fc4648,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "347c1cee-581a-4693-b712-a2da5dba7d23", "address": "fa:16:3e:21:af:6f", "network": {"id": "f886aaab-3794-46e5-a7e6-c1b86b6b50cc", "bridge": "br-int", "label": "tempest-network-smoke--586706688", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap347c1cee-58", "ovs_interfaceid": "347c1cee-581a-4693-b712-a2da5dba7d23", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  1 05:17:41 np0005540826 nova_compute[229148]: 2025-12-01 10:17:41.432 229152 DEBUG nova.network.os_vif_util [None req-3b3583ae-940e-4acd-a9cd-8bcc66273c95 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Converting VIF {"id": "347c1cee-581a-4693-b712-a2da5dba7d23", "address": "fa:16:3e:21:af:6f", "network": {"id": "f886aaab-3794-46e5-a7e6-c1b86b6b50cc", "bridge": "br-int", "label": "tempest-network-smoke--586706688", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap347c1cee-58", "ovs_interfaceid": "347c1cee-581a-4693-b712-a2da5dba7d23", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  1 05:17:41 np0005540826 nova_compute[229148]: 2025-12-01 10:17:41.433 229152 DEBUG nova.network.os_vif_util [None req-3b3583ae-940e-4acd-a9cd-8bcc66273c95 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:21:af:6f,bridge_name='br-int',has_traffic_filtering=True,id=347c1cee-581a-4693-b712-a2da5dba7d23,network=Network(f886aaab-3794-46e5-a7e6-c1b86b6b50cc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap347c1cee-58') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  1 05:17:41 np0005540826 nova_compute[229148]: 2025-12-01 10:17:41.433 229152 DEBUG os_vif [None req-3b3583ae-940e-4acd-a9cd-8bcc66273c95 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:21:af:6f,bridge_name='br-int',has_traffic_filtering=True,id=347c1cee-581a-4693-b712-a2da5dba7d23,network=Network(f886aaab-3794-46e5-a7e6-c1b86b6b50cc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap347c1cee-58') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  1 05:17:41 np0005540826 nova_compute[229148]: 2025-12-01 10:17:41.434 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:17:41 np0005540826 nova_compute[229148]: 2025-12-01 10:17:41.434 229152 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  1 05:17:41 np0005540826 nova_compute[229148]: 2025-12-01 10:17:41.435 229152 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  1 05:17:41 np0005540826 nova_compute[229148]: 2025-12-01 10:17:41.438 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:17:41 np0005540826 nova_compute[229148]: 2025-12-01 10:17:41.438 229152 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap347c1cee-58, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  1 05:17:41 np0005540826 nova_compute[229148]: 2025-12-01 10:17:41.439 229152 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap347c1cee-58, col_values=(('external_ids', {'iface-id': '347c1cee-581a-4693-b712-a2da5dba7d23', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:21:af:6f', 'vm-uuid': '601aa42c-17a5-4bf9-afe5-f481c5fc4648'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  1 05:17:41 np0005540826 nova_compute[229148]: 2025-12-01 10:17:41.440 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:17:41 np0005540826 NetworkManager[48989]: <info>  [1764584261.4417] manager: (tap347c1cee-58): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/37)
Dec  1 05:17:41 np0005540826 nova_compute[229148]: 2025-12-01 10:17:41.445 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  1 05:17:41 np0005540826 nova_compute[229148]: 2025-12-01 10:17:41.449 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:17:41 np0005540826 nova_compute[229148]: 2025-12-01 10:17:41.451 229152 INFO os_vif [None req-3b3583ae-940e-4acd-a9cd-8bcc66273c95 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:21:af:6f,bridge_name='br-int',has_traffic_filtering=True,id=347c1cee-581a-4693-b712-a2da5dba7d23,network=Network(f886aaab-3794-46e5-a7e6-c1b86b6b50cc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap347c1cee-58')#033[00m
Dec  1 05:17:41 np0005540826 nova_compute[229148]: 2025-12-01 10:17:41.451 229152 DEBUG nova.virt.libvirt.vif [None req-3b3583ae-940e-4acd-a9cd-8bcc66273c95 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-01T10:16:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1837747572',display_name='tempest-TestNetworkBasicOps-server-1837747572',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1837747572',id=3,image_ref='8f75d6de-6ce0-44e1-b417-d0111424475b',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFjrgX1ALH5WYnvTxG769+Xdy1tbtrks6o3UiLYtaXfVRyE0n3DLs9uvOak3ATO6GqaSLDf8DFyog39lXa9nSgA4lrJBT9//92DCbgHhZGYm8LZVBAItxiG3grm37RKP6w==',key_name='tempest-TestNetworkBasicOps-271287835',keypairs=<?>,launch_index=0,launched_at=2025-12-01T10:17:12Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='9f6be4e572624210b91193c011607c08',ramdisk_id='',reservation_id='r-hkny07g0',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='8f75d6de-6ce0-44e1-b417-d0111424475b',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1248115384',owner_user_name='tempest-TestNetworkBasicOps-1248115384-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-12-01T10:17:12Z,user_data=None,user_id='5b56a238daf0445798410e51caada0ff',uuid=601aa42c-17a5-4bf9-afe5-f481c5fc4648,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "347c1cee-581a-4693-b712-a2da5dba7d23", "address": "fa:16:3e:21:af:6f", "network": {"id": "f886aaab-3794-46e5-a7e6-c1b86b6b50cc", "bridge": "br-int", "label": "tempest-network-smoke--586706688", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap347c1cee-58", "ovs_interfaceid": "347c1cee-581a-4693-b712-a2da5dba7d23", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  1 05:17:41 np0005540826 nova_compute[229148]: 2025-12-01 10:17:41.452 229152 DEBUG nova.network.os_vif_util [None req-3b3583ae-940e-4acd-a9cd-8bcc66273c95 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Converting VIF {"id": "347c1cee-581a-4693-b712-a2da5dba7d23", "address": "fa:16:3e:21:af:6f", "network": {"id": "f886aaab-3794-46e5-a7e6-c1b86b6b50cc", "bridge": "br-int", "label": "tempest-network-smoke--586706688", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap347c1cee-58", "ovs_interfaceid": "347c1cee-581a-4693-b712-a2da5dba7d23", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  1 05:17:41 np0005540826 nova_compute[229148]: 2025-12-01 10:17:41.452 229152 DEBUG nova.network.os_vif_util [None req-3b3583ae-940e-4acd-a9cd-8bcc66273c95 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:21:af:6f,bridge_name='br-int',has_traffic_filtering=True,id=347c1cee-581a-4693-b712-a2da5dba7d23,network=Network(f886aaab-3794-46e5-a7e6-c1b86b6b50cc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap347c1cee-58') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  1 05:17:41 np0005540826 nova_compute[229148]: 2025-12-01 10:17:41.456 229152 DEBUG nova.virt.libvirt.guest [None req-3b3583ae-940e-4acd-a9cd-8bcc66273c95 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] attach device xml: <interface type="ethernet">
Dec  1 05:17:41 np0005540826 nova_compute[229148]:  <mac address="fa:16:3e:21:af:6f"/>
Dec  1 05:17:41 np0005540826 nova_compute[229148]:  <model type="virtio"/>
Dec  1 05:17:41 np0005540826 nova_compute[229148]:  <driver name="vhost" rx_queue_size="512"/>
Dec  1 05:17:41 np0005540826 nova_compute[229148]:  <mtu size="1442"/>
Dec  1 05:17:41 np0005540826 nova_compute[229148]:  <target dev="tap347c1cee-58"/>
Dec  1 05:17:41 np0005540826 nova_compute[229148]: </interface>
Dec  1 05:17:41 np0005540826 nova_compute[229148]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Dec  1 05:17:41 np0005540826 kernel: tap347c1cee-58: entered promiscuous mode
Dec  1 05:17:41 np0005540826 NetworkManager[48989]: <info>  [1764584261.4681] manager: (tap347c1cee-58): new Tun device (/org/freedesktop/NetworkManager/Devices/38)
Dec  1 05:17:41 np0005540826 nova_compute[229148]: 2025-12-01 10:17:41.470 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:17:41 np0005540826 ovn_controller[132309]: 2025-12-01T10:17:41Z|00043|binding|INFO|Claiming lport 347c1cee-581a-4693-b712-a2da5dba7d23 for this chassis.
Dec  1 05:17:41 np0005540826 ovn_controller[132309]: 2025-12-01T10:17:41Z|00044|binding|INFO|347c1cee-581a-4693-b712-a2da5dba7d23: Claiming fa:16:3e:21:af:6f 10.100.0.20
Dec  1 05:17:41 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:17:41.482 141685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:21:af:6f 10.100.0.20'], port_security=['fa:16:3e:21:af:6f 10.100.0.20'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.20/28', 'neutron:device_id': '601aa42c-17a5-4bf9-afe5-f481c5fc4648', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f886aaab-3794-46e5-a7e6-c1b86b6b50cc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9f6be4e572624210b91193c011607c08', 'neutron:revision_number': '2', 'neutron:security_group_ids': '11501149-732d-4202-ad97-ece49baad0dd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1d7f7415-fd59-4260-bb1c-2b5395bc38fb, chassis=[<ovs.db.idl.Row object at 0x7fe07c6626d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe07c6626d0>], logical_port=347c1cee-581a-4693-b712-a2da5dba7d23) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  1 05:17:41 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:17:41.484 141685 INFO neutron.agent.ovn.metadata.agent [-] Port 347c1cee-581a-4693-b712-a2da5dba7d23 in datapath f886aaab-3794-46e5-a7e6-c1b86b6b50cc bound to our chassis#033[00m
Dec  1 05:17:41 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:17:41.485 141685 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f886aaab-3794-46e5-a7e6-c1b86b6b50cc#033[00m
Dec  1 05:17:41 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:17:41.502 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[b60ba1ca-0771-432d-a72e-b4d86dff4c26]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:17:41 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:17:41.503 141685 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf886aaab-31 in ovnmeta-f886aaab-3794-46e5-a7e6-c1b86b6b50cc namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec  1 05:17:41 np0005540826 systemd-udevd[234936]: Network interface NamePolicy= disabled on kernel command line.
Dec  1 05:17:41 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:17:41.505 233565 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf886aaab-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec  1 05:17:41 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:17:41.505 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[e13c1524-9c72-4eae-84b8-22efedb133ff]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:17:41 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:17:41.507 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[9a57c365-68a0-4d6b-ac85-2ef2040764be]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:17:41 np0005540826 nova_compute[229148]: 2025-12-01 10:17:41.513 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:17:41 np0005540826 ovn_controller[132309]: 2025-12-01T10:17:41Z|00045|binding|INFO|Setting lport 347c1cee-581a-4693-b712-a2da5dba7d23 ovn-installed in OVS
Dec  1 05:17:41 np0005540826 ovn_controller[132309]: 2025-12-01T10:17:41Z|00046|binding|INFO|Setting lport 347c1cee-581a-4693-b712-a2da5dba7d23 up in Southbound
Dec  1 05:17:41 np0005540826 nova_compute[229148]: 2025-12-01 10:17:41.518 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:17:41 np0005540826 NetworkManager[48989]: <info>  [1764584261.5212] device (tap347c1cee-58): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  1 05:17:41 np0005540826 NetworkManager[48989]: <info>  [1764584261.5225] device (tap347c1cee-58): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  1 05:17:41 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:17:41.524 141797 DEBUG oslo.privsep.daemon [-] privsep: reply[80a5cafd-4564-484d-9ef5-e38c57d7896c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:17:41 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:17:41.550 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[426f576f-4ccf-4aa9-8391-d8ae1e869063]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:17:41 np0005540826 nova_compute[229148]: 2025-12-01 10:17:41.560 229152 DEBUG nova.virt.libvirt.driver [None req-3b3583ae-940e-4acd-a9cd-8bcc66273c95 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  1 05:17:41 np0005540826 nova_compute[229148]: 2025-12-01 10:17:41.560 229152 DEBUG nova.virt.libvirt.driver [None req-3b3583ae-940e-4acd-a9cd-8bcc66273c95 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  1 05:17:41 np0005540826 nova_compute[229148]: 2025-12-01 10:17:41.561 229152 DEBUG nova.virt.libvirt.driver [None req-3b3583ae-940e-4acd-a9cd-8bcc66273c95 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] No VIF found with MAC fa:16:3e:ab:90:a9, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  1 05:17:41 np0005540826 nova_compute[229148]: 2025-12-01 10:17:41.561 229152 DEBUG nova.virt.libvirt.driver [None req-3b3583ae-940e-4acd-a9cd-8bcc66273c95 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] No VIF found with MAC fa:16:3e:21:af:6f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  1 05:17:41 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:17:41.581 233643 DEBUG oslo.privsep.daemon [-] privsep: reply[480b7952-bf62-4ef2-bbc5-ccdecbef0a87]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:17:41 np0005540826 nova_compute[229148]: 2025-12-01 10:17:41.585 229152 DEBUG nova.virt.libvirt.guest [None req-3b3583ae-940e-4acd-a9cd-8bcc66273c95 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  1 05:17:41 np0005540826 nova_compute[229148]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  1 05:17:41 np0005540826 nova_compute[229148]:  <nova:name>tempest-TestNetworkBasicOps-server-1837747572</nova:name>
Dec  1 05:17:41 np0005540826 nova_compute[229148]:  <nova:creationTime>2025-12-01 10:17:41</nova:creationTime>
Dec  1 05:17:41 np0005540826 nova_compute[229148]:  <nova:flavor name="m1.nano">
Dec  1 05:17:41 np0005540826 nova_compute[229148]:    <nova:memory>128</nova:memory>
Dec  1 05:17:41 np0005540826 nova_compute[229148]:    <nova:disk>1</nova:disk>
Dec  1 05:17:41 np0005540826 nova_compute[229148]:    <nova:swap>0</nova:swap>
Dec  1 05:17:41 np0005540826 nova_compute[229148]:    <nova:ephemeral>0</nova:ephemeral>
Dec  1 05:17:41 np0005540826 nova_compute[229148]:    <nova:vcpus>1</nova:vcpus>
Dec  1 05:17:41 np0005540826 nova_compute[229148]:  </nova:flavor>
Dec  1 05:17:41 np0005540826 nova_compute[229148]:  <nova:owner>
Dec  1 05:17:41 np0005540826 nova_compute[229148]:    <nova:user uuid="5b56a238daf0445798410e51caada0ff">tempest-TestNetworkBasicOps-1248115384-project-member</nova:user>
Dec  1 05:17:41 np0005540826 nova_compute[229148]:    <nova:project uuid="9f6be4e572624210b91193c011607c08">tempest-TestNetworkBasicOps-1248115384</nova:project>
Dec  1 05:17:41 np0005540826 nova_compute[229148]:  </nova:owner>
Dec  1 05:17:41 np0005540826 nova_compute[229148]:  <nova:root type="image" uuid="8f75d6de-6ce0-44e1-b417-d0111424475b"/>
Dec  1 05:17:41 np0005540826 nova_compute[229148]:  <nova:ports>
Dec  1 05:17:41 np0005540826 nova_compute[229148]:    <nova:port uuid="df689760-00c2-4efe-91db-f3bec7cc8992">
Dec  1 05:17:41 np0005540826 nova_compute[229148]:      <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Dec  1 05:17:41 np0005540826 nova_compute[229148]:    </nova:port>
Dec  1 05:17:41 np0005540826 nova_compute[229148]:    <nova:port uuid="347c1cee-581a-4693-b712-a2da5dba7d23">
Dec  1 05:17:41 np0005540826 nova_compute[229148]:      <nova:ip type="fixed" address="10.100.0.20" ipVersion="4"/>
Dec  1 05:17:41 np0005540826 nova_compute[229148]:    </nova:port>
Dec  1 05:17:41 np0005540826 nova_compute[229148]:  </nova:ports>
Dec  1 05:17:41 np0005540826 nova_compute[229148]: </nova:instance>
Dec  1 05:17:41 np0005540826 nova_compute[229148]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Dec  1 05:17:41 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:17:41.589 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[ca711a51-de5a-44af-b590-7c5fdea8a9d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:17:41 np0005540826 NetworkManager[48989]: <info>  [1764584261.5906] manager: (tapf886aaab-30): new Veth device (/org/freedesktop/NetworkManager/Devices/39)
Dec  1 05:17:41 np0005540826 systemd-udevd[234939]: Network interface NamePolicy= disabled on kernel command line.
Dec  1 05:17:41 np0005540826 nova_compute[229148]: 2025-12-01 10:17:41.614 229152 DEBUG oslo_concurrency.lockutils [None req-3b3583ae-940e-4acd-a9cd-8bcc66273c95 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "interface-601aa42c-17a5-4bf9-afe5-f481c5fc4648-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 3.813s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:17:41 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:17:41 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:17:41.620 233643 DEBUG oslo.privsep.daemon [-] privsep: reply[0f6ade83-674d-41f3-86b9-29c5f0060758]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:17:41 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:17:41.623 233643 DEBUG oslo.privsep.daemon [-] privsep: reply[f5ba1bf7-ba77-4958-9f2e-b7320437aa80]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:17:41 np0005540826 NetworkManager[48989]: <info>  [1764584261.6474] device (tapf886aaab-30): carrier: link connected
Dec  1 05:17:41 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:17:41.651 233643 DEBUG oslo.privsep.daemon [-] privsep: reply[ee3e2280-f156-411c-bbba-4c6f8bfe9ea0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:17:41 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:17:41.670 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[913e592a-7f79-495f-841e-ce4c02b56308]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf886aaab-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8a:e9:75'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 412416, 'reachable_time': 40781, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234964, 'error': None, 'target': 'ovnmeta-f886aaab-3794-46e5-a7e6-c1b86b6b50cc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:17:41 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:17:41.685 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[abf88f46-4a77-4e04-a7ff-5ff76465df98]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe8a:e975'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 412416, 'tstamp': 412416}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 234965, 'error': None, 'target': 'ovnmeta-f886aaab-3794-46e5-a7e6-c1b86b6b50cc', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:17:41 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:17:41.700 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[95473b26-cede-44a4-85a0-b1ab7d7fb525]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf886aaab-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8a:e9:75'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 412416, 'reachable_time': 40781, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 234966, 'error': None, 'target': 'ovnmeta-f886aaab-3794-46e5-a7e6-c1b86b6b50cc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:17:41 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:17:41.727 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[9d315678-8fd5-42bb-bec4-5c5426d1cf51]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:17:41 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:17:41.784 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[3ff66faf-c851-4703-aeea-5c225bec8652]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:17:41 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:17:41.786 141685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf886aaab-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  1 05:17:41 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:17:41.786 141685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  1 05:17:41 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:17:41.787 141685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf886aaab-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  1 05:17:41 np0005540826 nova_compute[229148]: 2025-12-01 10:17:41.788 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:17:41 np0005540826 kernel: tapf886aaab-30: entered promiscuous mode
Dec  1 05:17:41 np0005540826 NetworkManager[48989]: <info>  [1764584261.7897] manager: (tapf886aaab-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/40)
Dec  1 05:17:41 np0005540826 nova_compute[229148]: 2025-12-01 10:17:41.791 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:17:41 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:17:41.792 141685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf886aaab-30, col_values=(('external_ids', {'iface-id': '8d72689a-165c-4822-89f8-d12e2cc9954c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  1 05:17:41 np0005540826 nova_compute[229148]: 2025-12-01 10:17:41.793 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:17:41 np0005540826 ovn_controller[132309]: 2025-12-01T10:17:41Z|00047|binding|INFO|Releasing lport 8d72689a-165c-4822-89f8-d12e2cc9954c from this chassis (sb_readonly=0)
Dec  1 05:17:41 np0005540826 nova_compute[229148]: 2025-12-01 10:17:41.808 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:17:41 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:17:41.809 141685 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f886aaab-3794-46e5-a7e6-c1b86b6b50cc.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f886aaab-3794-46e5-a7e6-c1b86b6b50cc.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec  1 05:17:41 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:17:41.809 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[372ea3f4-cda6-490c-acb5-f627c495df01]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:17:41 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:17:41.810 141685 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec  1 05:17:41 np0005540826 ovn_metadata_agent[141680]: global
Dec  1 05:17:41 np0005540826 ovn_metadata_agent[141680]:    log         /dev/log local0 debug
Dec  1 05:17:41 np0005540826 ovn_metadata_agent[141680]:    log-tag     haproxy-metadata-proxy-f886aaab-3794-46e5-a7e6-c1b86b6b50cc
Dec  1 05:17:41 np0005540826 ovn_metadata_agent[141680]:    user        root
Dec  1 05:17:41 np0005540826 ovn_metadata_agent[141680]:    group       root
Dec  1 05:17:41 np0005540826 ovn_metadata_agent[141680]:    maxconn     1024
Dec  1 05:17:41 np0005540826 ovn_metadata_agent[141680]:    pidfile     /var/lib/neutron/external/pids/f886aaab-3794-46e5-a7e6-c1b86b6b50cc.pid.haproxy
Dec  1 05:17:41 np0005540826 ovn_metadata_agent[141680]:    daemon
Dec  1 05:17:41 np0005540826 ovn_metadata_agent[141680]: 
Dec  1 05:17:41 np0005540826 ovn_metadata_agent[141680]: defaults
Dec  1 05:17:41 np0005540826 ovn_metadata_agent[141680]:    log global
Dec  1 05:17:41 np0005540826 ovn_metadata_agent[141680]:    mode http
Dec  1 05:17:41 np0005540826 ovn_metadata_agent[141680]:    option httplog
Dec  1 05:17:41 np0005540826 ovn_metadata_agent[141680]:    option dontlognull
Dec  1 05:17:41 np0005540826 ovn_metadata_agent[141680]:    option http-server-close
Dec  1 05:17:41 np0005540826 ovn_metadata_agent[141680]:    option forwardfor
Dec  1 05:17:41 np0005540826 ovn_metadata_agent[141680]:    retries                 3
Dec  1 05:17:41 np0005540826 ovn_metadata_agent[141680]:    timeout http-request    30s
Dec  1 05:17:41 np0005540826 ovn_metadata_agent[141680]:    timeout connect         30s
Dec  1 05:17:41 np0005540826 ovn_metadata_agent[141680]:    timeout client          32s
Dec  1 05:17:41 np0005540826 ovn_metadata_agent[141680]:    timeout server          32s
Dec  1 05:17:41 np0005540826 ovn_metadata_agent[141680]:    timeout http-keep-alive 30s
Dec  1 05:17:41 np0005540826 ovn_metadata_agent[141680]: 
Dec  1 05:17:41 np0005540826 ovn_metadata_agent[141680]: 
Dec  1 05:17:41 np0005540826 ovn_metadata_agent[141680]: listen listener
Dec  1 05:17:41 np0005540826 ovn_metadata_agent[141680]:    bind 169.254.169.254:80
Dec  1 05:17:41 np0005540826 ovn_metadata_agent[141680]:    server metadata /var/lib/neutron/metadata_proxy
Dec  1 05:17:41 np0005540826 ovn_metadata_agent[141680]:    http-request add-header X-OVN-Network-ID f886aaab-3794-46e5-a7e6-c1b86b6b50cc
Dec  1 05:17:41 np0005540826 ovn_metadata_agent[141680]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec  1 05:17:41 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:17:41.811 141685 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f886aaab-3794-46e5-a7e6-c1b86b6b50cc', 'env', 'PROCESS_TAG=haproxy-f886aaab-3794-46e5-a7e6-c1b86b6b50cc', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f886aaab-3794-46e5-a7e6-c1b86b6b50cc.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec  1 05:17:42 np0005540826 nova_compute[229148]: 2025-12-01 10:17:42.121 229152 DEBUG nova.compute.manager [req-dfc3c4e6-cabf-481f-8765-9277b58d9f27 req-81e53d3b-b2b9-4c17-9510-fd1e210ccaaf dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 601aa42c-17a5-4bf9-afe5-f481c5fc4648] Received event network-vif-plugged-347c1cee-581a-4693-b712-a2da5dba7d23 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  1 05:17:42 np0005540826 nova_compute[229148]: 2025-12-01 10:17:42.121 229152 DEBUG oslo_concurrency.lockutils [req-dfc3c4e6-cabf-481f-8765-9277b58d9f27 req-81e53d3b-b2b9-4c17-9510-fd1e210ccaaf dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Acquiring lock "601aa42c-17a5-4bf9-afe5-f481c5fc4648-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:17:42 np0005540826 nova_compute[229148]: 2025-12-01 10:17:42.121 229152 DEBUG oslo_concurrency.lockutils [req-dfc3c4e6-cabf-481f-8765-9277b58d9f27 req-81e53d3b-b2b9-4c17-9510-fd1e210ccaaf dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Lock "601aa42c-17a5-4bf9-afe5-f481c5fc4648-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:17:42 np0005540826 nova_compute[229148]: 2025-12-01 10:17:42.122 229152 DEBUG oslo_concurrency.lockutils [req-dfc3c4e6-cabf-481f-8765-9277b58d9f27 req-81e53d3b-b2b9-4c17-9510-fd1e210ccaaf dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Lock "601aa42c-17a5-4bf9-afe5-f481c5fc4648-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:17:42 np0005540826 nova_compute[229148]: 2025-12-01 10:17:42.122 229152 DEBUG nova.compute.manager [req-dfc3c4e6-cabf-481f-8765-9277b58d9f27 req-81e53d3b-b2b9-4c17-9510-fd1e210ccaaf dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 601aa42c-17a5-4bf9-afe5-f481c5fc4648] No waiting events found dispatching network-vif-plugged-347c1cee-581a-4693-b712-a2da5dba7d23 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  1 05:17:42 np0005540826 nova_compute[229148]: 2025-12-01 10:17:42.122 229152 WARNING nova.compute.manager [req-dfc3c4e6-cabf-481f-8765-9277b58d9f27 req-81e53d3b-b2b9-4c17-9510-fd1e210ccaaf dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 601aa42c-17a5-4bf9-afe5-f481c5fc4648] Received unexpected event network-vif-plugged-347c1cee-581a-4693-b712-a2da5dba7d23 for instance with vm_state active and task_state None.#033[00m
Dec  1 05:17:42 np0005540826 podman[234998]: 2025-12-01 10:17:42.177143256 +0000 UTC m=+0.056674464 container create 40291fda591174279fe01e2b90a5f36e66415c44ef14e4d2ae07bb06f154dc92 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f886aaab-3794-46e5-a7e6-c1b86b6b50cc, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Dec  1 05:17:42 np0005540826 systemd[1]: Started libpod-conmon-40291fda591174279fe01e2b90a5f36e66415c44ef14e4d2ae07bb06f154dc92.scope.
Dec  1 05:17:42 np0005540826 podman[234998]: 2025-12-01 10:17:42.145962056 +0000 UTC m=+0.025493284 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  1 05:17:42 np0005540826 systemd[1]: Started libcrun container.
Dec  1 05:17:42 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a8fce5398067cdec27c2cb9708fc75a47eecde9ab14238063473be4f10bcde09/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  1 05:17:42 np0005540826 podman[234998]: 2025-12-01 10:17:42.272939482 +0000 UTC m=+0.152470710 container init 40291fda591174279fe01e2b90a5f36e66415c44ef14e4d2ae07bb06f154dc92 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f886aaab-3794-46e5-a7e6-c1b86b6b50cc, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2)
Dec  1 05:17:42 np0005540826 podman[234998]: 2025-12-01 10:17:42.27833683 +0000 UTC m=+0.157868038 container start 40291fda591174279fe01e2b90a5f36e66415c44ef14e4d2ae07bb06f154dc92 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f886aaab-3794-46e5-a7e6-c1b86b6b50cc, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2)
Dec  1 05:17:42 np0005540826 neutron-haproxy-ovnmeta-f886aaab-3794-46e5-a7e6-c1b86b6b50cc[235014]: [NOTICE]   (235018) : New worker (235020) forked
Dec  1 05:17:42 np0005540826 neutron-haproxy-ovnmeta-f886aaab-3794-46e5-a7e6-c1b86b6b50cc[235014]: [NOTICE]   (235018) : Loading success.
Dec  1 05:17:42 np0005540826 nova_compute[229148]: 2025-12-01 10:17:42.540 229152 DEBUG nova.network.neutron [req-001ca431-b4b4-4d43-8956-efc8cce455e1 req-4b9c1834-fe2b-4ee7-b1d1-b875e896a5d7 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 601aa42c-17a5-4bf9-afe5-f481c5fc4648] Updated VIF entry in instance network info cache for port 347c1cee-581a-4693-b712-a2da5dba7d23. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  1 05:17:42 np0005540826 nova_compute[229148]: 2025-12-01 10:17:42.541 229152 DEBUG nova.network.neutron [req-001ca431-b4b4-4d43-8956-efc8cce455e1 req-4b9c1834-fe2b-4ee7-b1d1-b875e896a5d7 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 601aa42c-17a5-4bf9-afe5-f481c5fc4648] Updating instance_info_cache with network_info: [{"id": "df689760-00c2-4efe-91db-f3bec7cc8992", "address": "fa:16:3e:ab:90:a9", "network": {"id": "1347bb43-2e57-4582-adf2-6882693ebbc2", "bridge": "br-int", "label": "tempest-network-smoke--629227178", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf689760-00", "ovs_interfaceid": "df689760-00c2-4efe-91db-f3bec7cc8992", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "347c1cee-581a-4693-b712-a2da5dba7d23", "address": "fa:16:3e:21:af:6f", "network": {"id": "f886aaab-3794-46e5-a7e6-c1b86b6b50cc", "bridge": "br-int", "label": "tempest-network-smoke--586706688", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap347c1cee-58", "ovs_interfaceid": "347c1cee-581a-4693-b712-a2da5dba7d23", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  1 05:17:42 np0005540826 nova_compute[229148]: 2025-12-01 10:17:42.560 229152 DEBUG oslo_concurrency.lockutils [req-001ca431-b4b4-4d43-8956-efc8cce455e1 req-4b9c1834-fe2b-4ee7-b1d1-b875e896a5d7 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Releasing lock "refresh_cache-601aa42c-17a5-4bf9-afe5-f481c5fc4648" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  1 05:17:42 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:17:42 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:17:42 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:17:42.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:17:43 np0005540826 nova_compute[229148]: 2025-12-01 10:17:43.000 229152 DEBUG oslo_concurrency.lockutils [None req-638588d3-ff0f-4c6e-840b-57db8c4f3b81 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Acquiring lock "interface-601aa42c-17a5-4bf9-afe5-f481c5fc4648-347c1cee-581a-4693-b712-a2da5dba7d23" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:17:43 np0005540826 nova_compute[229148]: 2025-12-01 10:17:43.002 229152 DEBUG oslo_concurrency.lockutils [None req-638588d3-ff0f-4c6e-840b-57db8c4f3b81 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "interface-601aa42c-17a5-4bf9-afe5-f481c5fc4648-347c1cee-581a-4693-b712-a2da5dba7d23" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:17:43 np0005540826 nova_compute[229148]: 2025-12-01 10:17:43.027 229152 DEBUG nova.objects.instance [None req-638588d3-ff0f-4c6e-840b-57db8c4f3b81 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lazy-loading 'flavor' on Instance uuid 601aa42c-17a5-4bf9-afe5-f481c5fc4648 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  1 05:17:43 np0005540826 nova_compute[229148]: 2025-12-01 10:17:43.053 229152 DEBUG nova.virt.libvirt.vif [None req-638588d3-ff0f-4c6e-840b-57db8c4f3b81 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-01T10:16:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1837747572',display_name='tempest-TestNetworkBasicOps-server-1837747572',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1837747572',id=3,image_ref='8f75d6de-6ce0-44e1-b417-d0111424475b',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFjrgX1ALH5WYnvTxG769+Xdy1tbtrks6o3UiLYtaXfVRyE0n3DLs9uvOak3ATO6GqaSLDf8DFyog39lXa9nSgA4lrJBT9//92DCbgHhZGYm8LZVBAItxiG3grm37RKP6w==',key_name='tempest-TestNetworkBasicOps-271287835',keypairs=<?>,launch_index=0,launched_at=2025-12-01T10:17:12Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='9f6be4e572624210b91193c011607c08',ramdisk_id='',reservation_id='r-hkny07g0',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='8f75d6de-6ce0-44e1-b417-d0111424475b',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1248115384',owner_user_name='tempest-TestNetworkBasicOps-1248115384-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-12-01T10:17:12Z,user_data=None,user_id='5b56a238daf0445798410e51caada0ff',uuid=601aa42c-17a5-4bf9-afe5-f481c5fc4648,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "347c1cee-581a-4693-b712-a2da5dba7d23", "address": "fa:16:3e:21:af:6f", "network": {"id": "f886aaab-3794-46e5-a7e6-c1b86b6b50cc", "bridge": "br-int", "label": "tempest-network-smoke--586706688", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap347c1cee-58", "ovs_interfaceid": "347c1cee-581a-4693-b712-a2da5dba7d23", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  1 05:17:43 np0005540826 nova_compute[229148]: 2025-12-01 10:17:43.053 229152 DEBUG nova.network.os_vif_util [None req-638588d3-ff0f-4c6e-840b-57db8c4f3b81 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Converting VIF {"id": "347c1cee-581a-4693-b712-a2da5dba7d23", "address": "fa:16:3e:21:af:6f", "network": {"id": "f886aaab-3794-46e5-a7e6-c1b86b6b50cc", "bridge": "br-int", "label": "tempest-network-smoke--586706688", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap347c1cee-58", "ovs_interfaceid": "347c1cee-581a-4693-b712-a2da5dba7d23", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  1 05:17:43 np0005540826 nova_compute[229148]: 2025-12-01 10:17:43.054 229152 DEBUG nova.network.os_vif_util [None req-638588d3-ff0f-4c6e-840b-57db8c4f3b81 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:21:af:6f,bridge_name='br-int',has_traffic_filtering=True,id=347c1cee-581a-4693-b712-a2da5dba7d23,network=Network(f886aaab-3794-46e5-a7e6-c1b86b6b50cc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap347c1cee-58') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  1 05:17:43 np0005540826 nova_compute[229148]: 2025-12-01 10:17:43.058 229152 DEBUG nova.virt.libvirt.guest [None req-638588d3-ff0f-4c6e-840b-57db8c4f3b81 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:21:af:6f"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap347c1cee-58"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Dec  1 05:17:43 np0005540826 nova_compute[229148]: 2025-12-01 10:17:43.060 229152 DEBUG nova.virt.libvirt.guest [None req-638588d3-ff0f-4c6e-840b-57db8c4f3b81 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:21:af:6f"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap347c1cee-58"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Dec  1 05:17:43 np0005540826 nova_compute[229148]: 2025-12-01 10:17:43.062 229152 DEBUG nova.virt.libvirt.driver [None req-638588d3-ff0f-4c6e-840b-57db8c4f3b81 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Attempting to detach device tap347c1cee-58 from instance 601aa42c-17a5-4bf9-afe5-f481c5fc4648 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Dec  1 05:17:43 np0005540826 nova_compute[229148]: 2025-12-01 10:17:43.063 229152 DEBUG nova.virt.libvirt.guest [None req-638588d3-ff0f-4c6e-840b-57db8c4f3b81 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] detach device xml: <interface type="ethernet">
Dec  1 05:17:43 np0005540826 nova_compute[229148]:  <mac address="fa:16:3e:21:af:6f"/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:  <model type="virtio"/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:  <driver name="vhost" rx_queue_size="512"/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:  <mtu size="1442"/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:  <target dev="tap347c1cee-58"/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]: </interface>
Dec  1 05:17:43 np0005540826 nova_compute[229148]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Dec  1 05:17:43 np0005540826 nova_compute[229148]: 2025-12-01 10:17:43.073 229152 DEBUG nova.virt.libvirt.guest [None req-638588d3-ff0f-4c6e-840b-57db8c4f3b81 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:21:af:6f"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap347c1cee-58"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Dec  1 05:17:43 np0005540826 nova_compute[229148]: 2025-12-01 10:17:43.076 229152 DEBUG nova.virt.libvirt.guest [None req-638588d3-ff0f-4c6e-840b-57db8c4f3b81 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:21:af:6f"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap347c1cee-58"/></interface>not found in domain: <domain type='kvm' id='2'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:  <name>instance-00000003</name>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:  <uuid>601aa42c-17a5-4bf9-afe5-f481c5fc4648</uuid>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:  <metadata>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  1 05:17:43 np0005540826 nova_compute[229148]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:  <nova:name>tempest-TestNetworkBasicOps-server-1837747572</nova:name>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:  <nova:creationTime>2025-12-01 10:17:41</nova:creationTime>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:  <nova:flavor name="m1.nano">
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <nova:memory>128</nova:memory>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <nova:disk>1</nova:disk>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <nova:swap>0</nova:swap>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <nova:ephemeral>0</nova:ephemeral>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <nova:vcpus>1</nova:vcpus>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:  </nova:flavor>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:  <nova:owner>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <nova:user uuid="5b56a238daf0445798410e51caada0ff">tempest-TestNetworkBasicOps-1248115384-project-member</nova:user>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <nova:project uuid="9f6be4e572624210b91193c011607c08">tempest-TestNetworkBasicOps-1248115384</nova:project>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:  </nova:owner>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:  <nova:root type="image" uuid="8f75d6de-6ce0-44e1-b417-d0111424475b"/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:  <nova:ports>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <nova:port uuid="df689760-00c2-4efe-91db-f3bec7cc8992">
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    </nova:port>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <nova:port uuid="347c1cee-581a-4693-b712-a2da5dba7d23">
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <nova:ip type="fixed" address="10.100.0.20" ipVersion="4"/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    </nova:port>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:  </nova:ports>
Dec  1 05:17:43 np0005540826 nova_compute[229148]: </nova:instance>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:  </metadata>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:  <memory unit='KiB'>131072</memory>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:  <currentMemory unit='KiB'>131072</currentMemory>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:  <vcpu placement='static'>1</vcpu>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:  <resource>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <partition>/machine</partition>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:  </resource>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:  <sysinfo type='smbios'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <system>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <entry name='manufacturer'>RDO</entry>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <entry name='product'>OpenStack Compute</entry>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <entry name='serial'>601aa42c-17a5-4bf9-afe5-f481c5fc4648</entry>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <entry name='uuid'>601aa42c-17a5-4bf9-afe5-f481c5fc4648</entry>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <entry name='family'>Virtual Machine</entry>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    </system>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:  </sysinfo>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:  <os>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <boot dev='hd'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <smbios mode='sysinfo'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:  </os>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:  <features>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <acpi/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <apic/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <vmcoreinfo state='on'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:  </features>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:  <cpu mode='custom' match='exact' check='full'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <model fallback='forbid'>EPYC-Rome</model>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <vendor>AMD</vendor>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <feature policy='require' name='x2apic'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <feature policy='require' name='tsc-deadline'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <feature policy='require' name='hypervisor'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <feature policy='require' name='tsc_adjust'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <feature policy='require' name='spec-ctrl'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <feature policy='require' name='stibp'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <feature policy='require' name='ssbd'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <feature policy='require' name='cmp_legacy'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <feature policy='require' name='overflow-recov'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <feature policy='require' name='succor'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <feature policy='require' name='ibrs'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <feature policy='require' name='amd-ssbd'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <feature policy='require' name='virt-ssbd'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <feature policy='disable' name='lbrv'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <feature policy='disable' name='tsc-scale'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <feature policy='disable' name='vmcb-clean'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <feature policy='disable' name='flushbyasid'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <feature policy='disable' name='pause-filter'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <feature policy='disable' name='pfthreshold'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <feature policy='disable' name='svme-addr-chk'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <feature policy='require' name='lfence-always-serializing'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <feature policy='disable' name='xsaves'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <feature policy='disable' name='svm'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <feature policy='require' name='topoext'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <feature policy='disable' name='npt'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <feature policy='disable' name='nrip-save'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:  </cpu>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:  <clock offset='utc'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <timer name='pit' tickpolicy='delay'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <timer name='rtc' tickpolicy='catchup'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <timer name='hpet' present='no'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:  </clock>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:  <on_poweroff>destroy</on_poweroff>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:  <on_reboot>restart</on_reboot>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:  <on_crash>destroy</on_crash>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:  <devices>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <disk type='network' device='disk'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <driver name='qemu' type='raw' cache='none'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <auth username='openstack'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:        <secret type='ceph' uuid='365f19c2-81e5-5edd-b6b4-280555214d3a'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      </auth>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <source protocol='rbd' name='vms/601aa42c-17a5-4bf9-afe5-f481c5fc4648_disk' index='2'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:        <host name='192.168.122.100' port='6789'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:        <host name='192.168.122.102' port='6789'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:        <host name='192.168.122.101' port='6789'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      </source>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <target dev='vda' bus='virtio'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <alias name='virtio-disk0'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    </disk>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <disk type='network' device='cdrom'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <driver name='qemu' type='raw' cache='none'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <auth username='openstack'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:        <secret type='ceph' uuid='365f19c2-81e5-5edd-b6b4-280555214d3a'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      </auth>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <source protocol='rbd' name='vms/601aa42c-17a5-4bf9-afe5-f481c5fc4648_disk.config' index='1'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:        <host name='192.168.122.100' port='6789'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:        <host name='192.168.122.102' port='6789'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:        <host name='192.168.122.101' port='6789'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      </source>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <target dev='sda' bus='sata'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <readonly/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <alias name='sata0-0-0'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    </disk>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <controller type='pci' index='0' model='pcie-root'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <alias name='pcie.0'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <controller type='pci' index='1' model='pcie-root-port'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <target chassis='1' port='0x10'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <alias name='pci.1'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <controller type='pci' index='2' model='pcie-root-port'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <target chassis='2' port='0x11'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <alias name='pci.2'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <controller type='pci' index='3' model='pcie-root-port'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <target chassis='3' port='0x12'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <alias name='pci.3'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <controller type='pci' index='4' model='pcie-root-port'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <target chassis='4' port='0x13'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <alias name='pci.4'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <controller type='pci' index='5' model='pcie-root-port'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <target chassis='5' port='0x14'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <alias name='pci.5'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <controller type='pci' index='6' model='pcie-root-port'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <target chassis='6' port='0x15'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <alias name='pci.6'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <controller type='pci' index='7' model='pcie-root-port'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <target chassis='7' port='0x16'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <alias name='pci.7'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <controller type='pci' index='8' model='pcie-root-port'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <target chassis='8' port='0x17'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <alias name='pci.8'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <controller type='pci' index='9' model='pcie-root-port'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <target chassis='9' port='0x18'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <alias name='pci.9'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <controller type='pci' index='10' model='pcie-root-port'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <target chassis='10' port='0x19'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <alias name='pci.10'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <controller type='pci' index='11' model='pcie-root-port'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <target chassis='11' port='0x1a'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <alias name='pci.11'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <controller type='pci' index='12' model='pcie-root-port'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <target chassis='12' port='0x1b'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <alias name='pci.12'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <controller type='pci' index='13' model='pcie-root-port'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <target chassis='13' port='0x1c'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <alias name='pci.13'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <controller type='pci' index='14' model='pcie-root-port'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <target chassis='14' port='0x1d'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <alias name='pci.14'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <controller type='pci' index='15' model='pcie-root-port'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <target chassis='15' port='0x1e'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <alias name='pci.15'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <controller type='pci' index='16' model='pcie-root-port'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <target chassis='16' port='0x1f'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <alias name='pci.16'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <controller type='pci' index='17' model='pcie-root-port'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <target chassis='17' port='0x20'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <alias name='pci.17'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <controller type='pci' index='18' model='pcie-root-port'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <target chassis='18' port='0x21'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <alias name='pci.18'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <controller type='pci' index='19' model='pcie-root-port'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <target chassis='19' port='0x22'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <alias name='pci.19'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <controller type='pci' index='20' model='pcie-root-port'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <target chassis='20' port='0x23'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <alias name='pci.20'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <controller type='pci' index='21' model='pcie-root-port'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <target chassis='21' port='0x24'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <alias name='pci.21'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <controller type='pci' index='22' model='pcie-root-port'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <target chassis='22' port='0x25'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <alias name='pci.22'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <controller type='pci' index='23' model='pcie-root-port'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <target chassis='23' port='0x26'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <alias name='pci.23'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <controller type='pci' index='24' model='pcie-root-port'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <target chassis='24' port='0x27'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <alias name='pci.24'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <controller type='pci' index='25' model='pcie-root-port'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <target chassis='25' port='0x28'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <alias name='pci.25'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <model name='pcie-pci-bridge'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <alias name='pci.26'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <controller type='usb' index='0' model='piix3-uhci'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <alias name='usb'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <controller type='sata' index='0'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <alias name='ide'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <interface type='ethernet'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <mac address='fa:16:3e:ab:90:a9'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <target dev='tapdf689760-00'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <model type='virtio'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <driver name='vhost' rx_queue_size='512'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <mtu size='1442'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <alias name='net0'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    </interface>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <interface type='ethernet'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <mac address='fa:16:3e:21:af:6f'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <target dev='tap347c1cee-58'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <model type='virtio'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <driver name='vhost' rx_queue_size='512'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <mtu size='1442'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <alias name='net1'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    </interface>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <serial type='pty'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <source path='/dev/pts/0'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <log file='/var/lib/nova/instances/601aa42c-17a5-4bf9-afe5-f481c5fc4648/console.log' append='off'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <target type='isa-serial' port='0'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:        <model name='isa-serial'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      </target>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <alias name='serial0'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    </serial>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <console type='pty' tty='/dev/pts/0'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <source path='/dev/pts/0'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <log file='/var/lib/nova/instances/601aa42c-17a5-4bf9-afe5-f481c5fc4648/console.log' append='off'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <target type='serial' port='0'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <alias name='serial0'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    </console>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <input type='tablet' bus='usb'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <alias name='input0'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <address type='usb' bus='0' port='1'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    </input>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <input type='mouse' bus='ps2'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <alias name='input1'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    </input>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <input type='keyboard' bus='ps2'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <alias name='input2'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    </input>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <listen type='address' address='::0'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    </graphics>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <audio id='1' type='none'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <video>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <model type='virtio' heads='1' primary='yes'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <alias name='video0'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    </video>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <watchdog model='itco' action='reset'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <alias name='watchdog0'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    </watchdog>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <memballoon model='virtio'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <stats period='10'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <alias name='balloon0'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    </memballoon>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <rng model='virtio'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <backend model='random'>/dev/urandom</backend>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <alias name='rng0'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    </rng>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:  </devices>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <label>system_u:system_r:svirt_t:s0:c146,c605</label>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c146,c605</imagelabel>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:  </seclabel>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <label>+107:+107</label>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <imagelabel>+107:+107</imagelabel>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:  </seclabel>
Dec  1 05:17:43 np0005540826 nova_compute[229148]: </domain>
Dec  1 05:17:43 np0005540826 nova_compute[229148]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282#033[00m
Dec  1 05:17:43 np0005540826 nova_compute[229148]: 2025-12-01 10:17:43.077 229152 INFO nova.virt.libvirt.driver [None req-638588d3-ff0f-4c6e-840b-57db8c4f3b81 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Successfully detached device tap347c1cee-58 from instance 601aa42c-17a5-4bf9-afe5-f481c5fc4648 from the persistent domain config.#033[00m
Dec  1 05:17:43 np0005540826 nova_compute[229148]: 2025-12-01 10:17:43.078 229152 DEBUG nova.virt.libvirt.driver [None req-638588d3-ff0f-4c6e-840b-57db8c4f3b81 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] (1/8): Attempting to detach device tap347c1cee-58 with device alias net1 from instance 601aa42c-17a5-4bf9-afe5-f481c5fc4648 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523#033[00m
Dec  1 05:17:43 np0005540826 nova_compute[229148]: 2025-12-01 10:17:43.079 229152 DEBUG nova.virt.libvirt.guest [None req-638588d3-ff0f-4c6e-840b-57db8c4f3b81 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] detach device xml: <interface type="ethernet">
Dec  1 05:17:43 np0005540826 nova_compute[229148]:  <mac address="fa:16:3e:21:af:6f"/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:  <model type="virtio"/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:  <driver name="vhost" rx_queue_size="512"/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:  <mtu size="1442"/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:  <target dev="tap347c1cee-58"/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]: </interface>
Dec  1 05:17:43 np0005540826 nova_compute[229148]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Dec  1 05:17:43 np0005540826 kernel: tap347c1cee-58 (unregistering): left promiscuous mode
Dec  1 05:17:43 np0005540826 NetworkManager[48989]: <info>  [1764584263.1852] device (tap347c1cee-58): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  1 05:17:43 np0005540826 ovn_controller[132309]: 2025-12-01T10:17:43Z|00048|binding|INFO|Releasing lport 347c1cee-581a-4693-b712-a2da5dba7d23 from this chassis (sb_readonly=0)
Dec  1 05:17:43 np0005540826 ovn_controller[132309]: 2025-12-01T10:17:43Z|00049|binding|INFO|Setting lport 347c1cee-581a-4693-b712-a2da5dba7d23 down in Southbound
Dec  1 05:17:43 np0005540826 ovn_controller[132309]: 2025-12-01T10:17:43Z|00050|binding|INFO|Removing iface tap347c1cee-58 ovn-installed in OVS
Dec  1 05:17:43 np0005540826 nova_compute[229148]: 2025-12-01 10:17:43.196 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:17:43 np0005540826 nova_compute[229148]: 2025-12-01 10:17:43.199 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:17:43 np0005540826 nova_compute[229148]: 2025-12-01 10:17:43.201 229152 DEBUG nova.virt.libvirt.driver [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] Received event <DeviceRemovedEvent: 1764584263.200111, 601aa42c-17a5-4bf9-afe5-f481c5fc4648 => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370#033[00m
Dec  1 05:17:43 np0005540826 nova_compute[229148]: 2025-12-01 10:17:43.202 229152 DEBUG nova.virt.libvirt.driver [None req-638588d3-ff0f-4c6e-840b-57db8c4f3b81 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Start waiting for the detach event from libvirt for device tap347c1cee-58 with device alias net1 for instance 601aa42c-17a5-4bf9-afe5-f481c5fc4648 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599#033[00m
Dec  1 05:17:43 np0005540826 nova_compute[229148]: 2025-12-01 10:17:43.202 229152 DEBUG nova.virt.libvirt.guest [None req-638588d3-ff0f-4c6e-840b-57db8c4f3b81 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:21:af:6f"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap347c1cee-58"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Dec  1 05:17:43 np0005540826 nova_compute[229148]: 2025-12-01 10:17:43.206 229152 DEBUG nova.virt.libvirt.guest [None req-638588d3-ff0f-4c6e-840b-57db8c4f3b81 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:21:af:6f"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap347c1cee-58"/></interface>not found in domain: <domain type='kvm' id='2'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:  <name>instance-00000003</name>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:  <uuid>601aa42c-17a5-4bf9-afe5-f481c5fc4648</uuid>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:  <metadata>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  1 05:17:43 np0005540826 nova_compute[229148]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:  <nova:name>tempest-TestNetworkBasicOps-server-1837747572</nova:name>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:  <nova:creationTime>2025-12-01 10:17:41</nova:creationTime>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:  <nova:flavor name="m1.nano">
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <nova:memory>128</nova:memory>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <nova:disk>1</nova:disk>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <nova:swap>0</nova:swap>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <nova:ephemeral>0</nova:ephemeral>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <nova:vcpus>1</nova:vcpus>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:  </nova:flavor>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:  <nova:owner>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <nova:user uuid="5b56a238daf0445798410e51caada0ff">tempest-TestNetworkBasicOps-1248115384-project-member</nova:user>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <nova:project uuid="9f6be4e572624210b91193c011607c08">tempest-TestNetworkBasicOps-1248115384</nova:project>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:  </nova:owner>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:  <nova:root type="image" uuid="8f75d6de-6ce0-44e1-b417-d0111424475b"/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:  <nova:ports>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <nova:port uuid="df689760-00c2-4efe-91db-f3bec7cc8992">
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    </nova:port>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <nova:port uuid="347c1cee-581a-4693-b712-a2da5dba7d23">
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <nova:ip type="fixed" address="10.100.0.20" ipVersion="4"/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    </nova:port>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:  </nova:ports>
Dec  1 05:17:43 np0005540826 nova_compute[229148]: </nova:instance>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:  </metadata>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:  <memory unit='KiB'>131072</memory>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:  <currentMemory unit='KiB'>131072</currentMemory>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:  <vcpu placement='static'>1</vcpu>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:  <resource>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <partition>/machine</partition>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:  </resource>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:  <sysinfo type='smbios'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <system>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <entry name='manufacturer'>RDO</entry>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <entry name='product'>OpenStack Compute</entry>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <entry name='serial'>601aa42c-17a5-4bf9-afe5-f481c5fc4648</entry>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <entry name='uuid'>601aa42c-17a5-4bf9-afe5-f481c5fc4648</entry>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <entry name='family'>Virtual Machine</entry>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    </system>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:  </sysinfo>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:  <os>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <boot dev='hd'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <smbios mode='sysinfo'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:  </os>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:  <features>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <acpi/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <apic/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <vmcoreinfo state='on'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:  </features>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:  <cpu mode='custom' match='exact' check='full'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <model fallback='forbid'>EPYC-Rome</model>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <vendor>AMD</vendor>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <feature policy='require' name='x2apic'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <feature policy='require' name='tsc-deadline'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <feature policy='require' name='hypervisor'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <feature policy='require' name='tsc_adjust'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <feature policy='require' name='spec-ctrl'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <feature policy='require' name='stibp'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <feature policy='require' name='ssbd'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <feature policy='require' name='cmp_legacy'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <feature policy='require' name='overflow-recov'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <feature policy='require' name='succor'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <feature policy='require' name='ibrs'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <feature policy='require' name='amd-ssbd'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <feature policy='require' name='virt-ssbd'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <feature policy='disable' name='lbrv'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <feature policy='disable' name='tsc-scale'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <feature policy='disable' name='vmcb-clean'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <feature policy='disable' name='flushbyasid'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <feature policy='disable' name='pause-filter'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <feature policy='disable' name='pfthreshold'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <feature policy='disable' name='svme-addr-chk'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <feature policy='require' name='lfence-always-serializing'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <feature policy='disable' name='xsaves'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <feature policy='disable' name='svm'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <feature policy='require' name='topoext'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <feature policy='disable' name='npt'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <feature policy='disable' name='nrip-save'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:  </cpu>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:  <clock offset='utc'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <timer name='pit' tickpolicy='delay'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <timer name='rtc' tickpolicy='catchup'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <timer name='hpet' present='no'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:  </clock>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:  <on_poweroff>destroy</on_poweroff>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:  <on_reboot>restart</on_reboot>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:  <on_crash>destroy</on_crash>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:  <devices>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <disk type='network' device='disk'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <driver name='qemu' type='raw' cache='none'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <auth username='openstack'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:        <secret type='ceph' uuid='365f19c2-81e5-5edd-b6b4-280555214d3a'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      </auth>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <source protocol='rbd' name='vms/601aa42c-17a5-4bf9-afe5-f481c5fc4648_disk' index='2'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:        <host name='192.168.122.100' port='6789'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:        <host name='192.168.122.102' port='6789'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:        <host name='192.168.122.101' port='6789'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      </source>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <target dev='vda' bus='virtio'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <alias name='virtio-disk0'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    </disk>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <disk type='network' device='cdrom'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <driver name='qemu' type='raw' cache='none'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <auth username='openstack'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:        <secret type='ceph' uuid='365f19c2-81e5-5edd-b6b4-280555214d3a'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      </auth>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <source protocol='rbd' name='vms/601aa42c-17a5-4bf9-afe5-f481c5fc4648_disk.config' index='1'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:        <host name='192.168.122.100' port='6789'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:        <host name='192.168.122.102' port='6789'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:        <host name='192.168.122.101' port='6789'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      </source>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <target dev='sda' bus='sata'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <readonly/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <alias name='sata0-0-0'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    </disk>
Dec  1 05:17:43 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:17:43.205 141685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:21:af:6f 10.100.0.20'], port_security=['fa:16:3e:21:af:6f 10.100.0.20'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.20/28', 'neutron:device_id': '601aa42c-17a5-4bf9-afe5-f481c5fc4648', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f886aaab-3794-46e5-a7e6-c1b86b6b50cc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9f6be4e572624210b91193c011607c08', 'neutron:revision_number': '4', 'neutron:security_group_ids': '11501149-732d-4202-ad97-ece49baad0dd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1d7f7415-fd59-4260-bb1c-2b5395bc38fb, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe07c6626d0>], logical_port=347c1cee-581a-4693-b712-a2da5dba7d23) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe07c6626d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <controller type='pci' index='0' model='pcie-root'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <alias name='pcie.0'/>
Dec  1 05:17:43 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:17:43.206 141685 INFO neutron.agent.ovn.metadata.agent [-] Port 347c1cee-581a-4693-b712-a2da5dba7d23 in datapath f886aaab-3794-46e5-a7e6-c1b86b6b50cc unbound from our chassis#033[00m
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <controller type='pci' index='1' model='pcie-root-port'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <target chassis='1' port='0x10'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <alias name='pci.1'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <controller type='pci' index='2' model='pcie-root-port'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <target chassis='2' port='0x11'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <alias name='pci.2'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <controller type='pci' index='3' model='pcie-root-port'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <target chassis='3' port='0x12'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <alias name='pci.3'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <controller type='pci' index='4' model='pcie-root-port'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <target chassis='4' port='0x13'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <alias name='pci.4'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <controller type='pci' index='5' model='pcie-root-port'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <target chassis='5' port='0x14'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <alias name='pci.5'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <controller type='pci' index='6' model='pcie-root-port'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <target chassis='6' port='0x15'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <alias name='pci.6'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <controller type='pci' index='7' model='pcie-root-port'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <target chassis='7' port='0x16'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <alias name='pci.7'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <controller type='pci' index='8' model='pcie-root-port'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <target chassis='8' port='0x17'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <alias name='pci.8'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <controller type='pci' index='9' model='pcie-root-port'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <target chassis='9' port='0x18'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <alias name='pci.9'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <controller type='pci' index='10' model='pcie-root-port'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <target chassis='10' port='0x19'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <alias name='pci.10'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <controller type='pci' index='11' model='pcie-root-port'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <target chassis='11' port='0x1a'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <alias name='pci.11'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <controller type='pci' index='12' model='pcie-root-port'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <target chassis='12' port='0x1b'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <alias name='pci.12'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <controller type='pci' index='13' model='pcie-root-port'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <target chassis='13' port='0x1c'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <alias name='pci.13'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <controller type='pci' index='14' model='pcie-root-port'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <target chassis='14' port='0x1d'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <alias name='pci.14'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <controller type='pci' index='15' model='pcie-root-port'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <target chassis='15' port='0x1e'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <alias name='pci.15'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <controller type='pci' index='16' model='pcie-root-port'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <target chassis='16' port='0x1f'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <alias name='pci.16'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <controller type='pci' index='17' model='pcie-root-port'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <target chassis='17' port='0x20'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <alias name='pci.17'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <controller type='pci' index='18' model='pcie-root-port'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <target chassis='18' port='0x21'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <alias name='pci.18'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <controller type='pci' index='19' model='pcie-root-port'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <target chassis='19' port='0x22'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <alias name='pci.19'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <controller type='pci' index='20' model='pcie-root-port'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <target chassis='20' port='0x23'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <alias name='pci.20'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <controller type='pci' index='21' model='pcie-root-port'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <target chassis='21' port='0x24'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <alias name='pci.21'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <controller type='pci' index='22' model='pcie-root-port'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <target chassis='22' port='0x25'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <alias name='pci.22'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <controller type='pci' index='23' model='pcie-root-port'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <target chassis='23' port='0x26'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <alias name='pci.23'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:17:43 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:17:43.207 141685 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f886aaab-3794-46e5-a7e6-c1b86b6b50cc, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <controller type='pci' index='24' model='pcie-root-port'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <target chassis='24' port='0x27'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <alias name='pci.24'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <controller type='pci' index='25' model='pcie-root-port'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <target chassis='25' port='0x28'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <alias name='pci.25'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <model name='pcie-pci-bridge'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <alias name='pci.26'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <controller type='usb' index='0' model='piix3-uhci'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <alias name='usb'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <controller type='sata' index='0'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <alias name='ide'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <interface type='ethernet'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <mac address='fa:16:3e:ab:90:a9'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <target dev='tapdf689760-00'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <model type='virtio'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <driver name='vhost' rx_queue_size='512'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <mtu size='1442'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <alias name='net0'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    </interface>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <serial type='pty'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <source path='/dev/pts/0'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <log file='/var/lib/nova/instances/601aa42c-17a5-4bf9-afe5-f481c5fc4648/console.log' append='off'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <target type='isa-serial' port='0'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:        <model name='isa-serial'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      </target>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <alias name='serial0'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    </serial>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <console type='pty' tty='/dev/pts/0'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <source path='/dev/pts/0'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <log file='/var/lib/nova/instances/601aa42c-17a5-4bf9-afe5-f481c5fc4648/console.log' append='off'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <target type='serial' port='0'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <alias name='serial0'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    </console>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <input type='tablet' bus='usb'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <alias name='input0'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <address type='usb' bus='0' port='1'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    </input>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <input type='mouse' bus='ps2'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <alias name='input1'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    </input>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <input type='keyboard' bus='ps2'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <alias name='input2'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    </input>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <listen type='address' address='::0'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    </graphics>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <audio id='1' type='none'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <video>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <model type='virtio' heads='1' primary='yes'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <alias name='video0'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    </video>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <watchdog model='itco' action='reset'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <alias name='watchdog0'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    </watchdog>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    <memballoon model='virtio'>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <stats period='10'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <alias name='balloon0'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Dec  1 05:17:43 np0005540826 nova_compute[229148]:    </memballoon>
Dec  1 05:19:40 np0005540826 rsyslogd[1006]: imjournal: 2275 messages lost due to rate-limiting (20000 allowed within 600 seconds)
Dec  1 05:19:40 np0005540826 podman[238153]: 2025-12-01 10:19:40.9825461 +0000 UTC m=+0.063583042 container health_status 9ce59c2758d021c154e97f8a3178b73d8f43045c59b9ba6fd93369a760a49136 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec  1 05:19:40 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:19:40 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:19:40 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:19:40.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:19:41 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:19:41 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:19:41 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:19:41.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:19:41 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:19:42 np0005540826 nova_compute[229148]: 2025-12-01 10:19:42.231 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:19:42 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:19:42 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:19:42 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:19:42.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:19:43 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:19:43 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:19:43 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:19:43.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:19:44 np0005540826 nova_compute[229148]: 2025-12-01 10:19:44.454 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:19:44 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:19:44 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:19:44 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:19:44.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:19:45 np0005540826 nova_compute[229148]: 2025-12-01 10:19:45.110 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:19:45 np0005540826 nova_compute[229148]: 2025-12-01 10:19:45.110 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Dec  1 05:19:45 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:19:45 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:19:45 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:19:45.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:19:46 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:19:47 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:19:47 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:19:47 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:19:47.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:19:47 np0005540826 nova_compute[229148]: 2025-12-01 10:19:47.197 229152 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764584372.1953368, 5d9e348b-9ab2-41c8-85ab-0a3fde7feeaf => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  1 05:19:47 np0005540826 nova_compute[229148]: 2025-12-01 10:19:47.197 229152 INFO nova.compute.manager [-] [instance: 5d9e348b-9ab2-41c8-85ab-0a3fde7feeaf] VM Stopped (Lifecycle Event)#033[00m
Dec  1 05:19:47 np0005540826 nova_compute[229148]: 2025-12-01 10:19:47.234 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:19:47 np0005540826 nova_compute[229148]: 2025-12-01 10:19:47.378 229152 DEBUG nova.compute.manager [None req-f7d3f876-a5e8-403e-8976-0e60baa9edff - - - - - -] [instance: 5d9e348b-9ab2-41c8-85ab-0a3fde7feeaf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  1 05:19:47 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:19:47 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:19:47 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:19:47.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:19:49 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:19:49 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:19:49 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:19:49.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:19:49 np0005540826 nova_compute[229148]: 2025-12-01 10:19:49.456 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:19:49 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:19:49 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:19:49 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:19:49.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:19:50 np0005540826 podman[238202]: 2025-12-01 10:19:50.018748174 +0000 UTC m=+0.101354338 container health_status 6b06bbb7dde72e1297fe084ecc5611549b12415c8a2a7d771b9728c6924dbf55 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, org.label-schema.license=GPLv2)
Dec  1 05:19:50 np0005540826 nova_compute[229148]: 2025-12-01 10:19:50.124 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:19:50 np0005540826 nova_compute[229148]: 2025-12-01 10:19:50.415 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:19:51 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:19:51 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:19:51 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:19:51.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:19:51 np0005540826 nova_compute[229148]: 2025-12-01 10:19:51.129 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:19:51 np0005540826 nova_compute[229148]: 2025-12-01 10:19:51.130 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  1 05:19:51 np0005540826 nova_compute[229148]: 2025-12-01 10:19:51.130 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  1 05:19:51 np0005540826 nova_compute[229148]: 2025-12-01 10:19:51.151 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  1 05:19:51 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:19:51 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:19:51 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:19:51.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:19:51 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:19:52 np0005540826 nova_compute[229148]: 2025-12-01 10:19:52.109 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:19:52 np0005540826 nova_compute[229148]: 2025-12-01 10:19:52.237 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:19:53 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:19:53 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:19:53 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:19:53.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:19:53 np0005540826 nova_compute[229148]: 2025-12-01 10:19:53.110 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:19:53 np0005540826 nova_compute[229148]: 2025-12-01 10:19:53.111 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:19:53 np0005540826 nova_compute[229148]: 2025-12-01 10:19:53.111 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:19:54 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:19:54 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:19:54 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:19:54.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:19:54 np0005540826 nova_compute[229148]: 2025-12-01 10:19:54.457 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:19:55 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:19:55 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:19:55 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:19:55.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:19:55 np0005540826 nova_compute[229148]: 2025-12-01 10:19:55.109 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:19:55 np0005540826 nova_compute[229148]: 2025-12-01 10:19:55.110 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:19:55 np0005540826 nova_compute[229148]: 2025-12-01 10:19:55.110 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  1 05:19:55 np0005540826 nova_compute[229148]: 2025-12-01 10:19:55.110 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:19:55 np0005540826 nova_compute[229148]: 2025-12-01 10:19:55.137 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:19:55 np0005540826 nova_compute[229148]: 2025-12-01 10:19:55.137 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:19:55 np0005540826 nova_compute[229148]: 2025-12-01 10:19:55.138 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:19:55 np0005540826 nova_compute[229148]: 2025-12-01 10:19:55.138 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  1 05:19:55 np0005540826 nova_compute[229148]: 2025-12-01 10:19:55.138 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:19:55 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:19:55 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2796817309' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:19:55 np0005540826 nova_compute[229148]: 2025-12-01 10:19:55.626 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.487s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:19:55 np0005540826 podman[238255]: 2025-12-01 10:19:55.717513619 +0000 UTC m=+0.052209568 container health_status 2f6a34a2eb1139886d5df897b4264e2aa17883bd121a8757ac91426f6d44356f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3)
Dec  1 05:19:55 np0005540826 nova_compute[229148]: 2025-12-01 10:19:55.791 229152 WARNING nova.virt.libvirt.driver [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  1 05:19:55 np0005540826 nova_compute[229148]: 2025-12-01 10:19:55.792 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4985MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  1 05:19:55 np0005540826 nova_compute[229148]: 2025-12-01 10:19:55.793 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:19:55 np0005540826 nova_compute[229148]: 2025-12-01 10:19:55.793 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:19:55 np0005540826 nova_compute[229148]: 2025-12-01 10:19:55.990 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  1 05:19:55 np0005540826 nova_compute[229148]: 2025-12-01 10:19:55.990 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  1 05:19:56 np0005540826 nova_compute[229148]: 2025-12-01 10:19:56.100 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:19:56 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:19:56 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:19:56 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:19:56.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:19:56 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:19:56 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1156827399' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:19:56 np0005540826 nova_compute[229148]: 2025-12-01 10:19:56.541 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:19:56 np0005540826 nova_compute[229148]: 2025-12-01 10:19:56.548 229152 DEBUG nova.compute.provider_tree [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Inventory has not changed in ProviderTree for provider: 19014d04-db84-4f3d-831b-084720e9168c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  1 05:19:56 np0005540826 nova_compute[229148]: 2025-12-01 10:19:56.573 229152 DEBUG nova.scheduler.client.report [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Inventory has not changed for provider 19014d04-db84-4f3d-831b-084720e9168c based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  1 05:19:56 np0005540826 nova_compute[229148]: 2025-12-01 10:19:56.598 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  1 05:19:56 np0005540826 nova_compute[229148]: 2025-12-01 10:19:56.598 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.805s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:19:56 np0005540826 nova_compute[229148]: 2025-12-01 10:19:56.599 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:19:56 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:19:56 np0005540826 nova_compute[229148]: 2025-12-01 10:19:56.698 229152 DEBUG oslo_concurrency.lockutils [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Acquiring lock "c46b48c6-f17b-46fd-909f-65cf07dab4e6" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:19:56 np0005540826 nova_compute[229148]: 2025-12-01 10:19:56.699 229152 DEBUG oslo_concurrency.lockutils [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "c46b48c6-f17b-46fd-909f-65cf07dab4e6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:19:56 np0005540826 nova_compute[229148]: 2025-12-01 10:19:56.729 229152 DEBUG nova.compute.manager [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  1 05:19:56 np0005540826 nova_compute[229148]: 2025-12-01 10:19:56.868 229152 DEBUG oslo_concurrency.lockutils [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:19:56 np0005540826 nova_compute[229148]: 2025-12-01 10:19:56.868 229152 DEBUG oslo_concurrency.lockutils [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:19:56 np0005540826 nova_compute[229148]: 2025-12-01 10:19:56.875 229152 DEBUG nova.virt.hardware [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  1 05:19:56 np0005540826 nova_compute[229148]: 2025-12-01 10:19:56.875 229152 INFO nova.compute.claims [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Claim successful on node compute-1.ctlplane.example.com#033[00m
Dec  1 05:19:57 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:19:57 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:19:57 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:19:57.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:19:57 np0005540826 nova_compute[229148]: 2025-12-01 10:19:57.046 229152 DEBUG oslo_concurrency.processutils [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:19:57 np0005540826 nova_compute[229148]: 2025-12-01 10:19:57.239 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:19:57 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:19:57 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/129369375' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:19:57 np0005540826 nova_compute[229148]: 2025-12-01 10:19:57.506 229152 DEBUG oslo_concurrency.processutils [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:19:57 np0005540826 nova_compute[229148]: 2025-12-01 10:19:57.512 229152 DEBUG nova.compute.provider_tree [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Inventory has not changed in ProviderTree for provider: 19014d04-db84-4f3d-831b-084720e9168c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  1 05:19:57 np0005540826 nova_compute[229148]: 2025-12-01 10:19:57.534 229152 DEBUG nova.scheduler.client.report [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Inventory has not changed for provider 19014d04-db84-4f3d-831b-084720e9168c based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  1 05:19:57 np0005540826 nova_compute[229148]: 2025-12-01 10:19:57.570 229152 DEBUG oslo_concurrency.lockutils [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.702s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:19:57 np0005540826 nova_compute[229148]: 2025-12-01 10:19:57.571 229152 DEBUG nova.compute.manager [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  1 05:19:57 np0005540826 nova_compute[229148]: 2025-12-01 10:19:57.620 229152 DEBUG nova.compute.manager [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  1 05:19:57 np0005540826 nova_compute[229148]: 2025-12-01 10:19:57.621 229152 DEBUG nova.network.neutron [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  1 05:19:57 np0005540826 nova_compute[229148]: 2025-12-01 10:19:57.624 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:19:57 np0005540826 nova_compute[229148]: 2025-12-01 10:19:57.655 229152 INFO nova.virt.libvirt.driver [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  1 05:19:57 np0005540826 nova_compute[229148]: 2025-12-01 10:19:57.687 229152 DEBUG nova.compute.manager [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  1 05:19:57 np0005540826 nova_compute[229148]: 2025-12-01 10:19:57.785 229152 DEBUG nova.compute.manager [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  1 05:19:57 np0005540826 nova_compute[229148]: 2025-12-01 10:19:57.786 229152 DEBUG nova.virt.libvirt.driver [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  1 05:19:57 np0005540826 nova_compute[229148]: 2025-12-01 10:19:57.786 229152 INFO nova.virt.libvirt.driver [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Creating image(s)#033[00m
Dec  1 05:19:57 np0005540826 nova_compute[229148]: 2025-12-01 10:19:57.819 229152 DEBUG nova.storage.rbd_utils [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] rbd image c46b48c6-f17b-46fd-909f-65cf07dab4e6_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Dec  1 05:19:57 np0005540826 nova_compute[229148]: 2025-12-01 10:19:57.854 229152 DEBUG nova.storage.rbd_utils [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] rbd image c46b48c6-f17b-46fd-909f-65cf07dab4e6_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Dec  1 05:19:57 np0005540826 nova_compute[229148]: 2025-12-01 10:19:57.880 229152 DEBUG nova.storage.rbd_utils [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] rbd image c46b48c6-f17b-46fd-909f-65cf07dab4e6_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Dec  1 05:19:57 np0005540826 nova_compute[229148]: 2025-12-01 10:19:57.884 229152 DEBUG oslo_concurrency.processutils [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/caad95fa2cc8ed03bed2e9851744954b07ec7b34 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:19:57 np0005540826 nova_compute[229148]: 2025-12-01 10:19:57.908 229152 DEBUG nova.policy [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5b56a238daf0445798410e51caada0ff', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9f6be4e572624210b91193c011607c08', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  1 05:19:57 np0005540826 nova_compute[229148]: 2025-12-01 10:19:57.946 229152 DEBUG oslo_concurrency.processutils [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/caad95fa2cc8ed03bed2e9851744954b07ec7b34 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:19:57 np0005540826 nova_compute[229148]: 2025-12-01 10:19:57.947 229152 DEBUG oslo_concurrency.lockutils [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Acquiring lock "caad95fa2cc8ed03bed2e9851744954b07ec7b34" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:19:57 np0005540826 nova_compute[229148]: 2025-12-01 10:19:57.948 229152 DEBUG oslo_concurrency.lockutils [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "caad95fa2cc8ed03bed2e9851744954b07ec7b34" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:19:57 np0005540826 nova_compute[229148]: 2025-12-01 10:19:57.948 229152 DEBUG oslo_concurrency.lockutils [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "caad95fa2cc8ed03bed2e9851744954b07ec7b34" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:19:57 np0005540826 nova_compute[229148]: 2025-12-01 10:19:57.974 229152 DEBUG nova.storage.rbd_utils [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] rbd image c46b48c6-f17b-46fd-909f-65cf07dab4e6_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Dec  1 05:19:57 np0005540826 nova_compute[229148]: 2025-12-01 10:19:57.978 229152 DEBUG oslo_concurrency.processutils [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/caad95fa2cc8ed03bed2e9851744954b07ec7b34 c46b48c6-f17b-46fd-909f-65cf07dab4e6_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:19:58 np0005540826 nova_compute[229148]: 2025-12-01 10:19:58.111 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:19:58 np0005540826 nova_compute[229148]: 2025-12-01 10:19:58.111 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Dec  1 05:19:58 np0005540826 nova_compute[229148]: 2025-12-01 10:19:58.176 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Dec  1 05:19:58 np0005540826 nova_compute[229148]: 2025-12-01 10:19:58.269 229152 DEBUG oslo_concurrency.processutils [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/caad95fa2cc8ed03bed2e9851744954b07ec7b34 c46b48c6-f17b-46fd-909f-65cf07dab4e6_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.291s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:19:58 np0005540826 nova_compute[229148]: 2025-12-01 10:19:58.357 229152 DEBUG nova.storage.rbd_utils [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] resizing rbd image c46b48c6-f17b-46fd-909f-65cf07dab4e6_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Dec  1 05:19:58 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:19:58 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:19:58 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:19:58.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:19:58 np0005540826 nova_compute[229148]: 2025-12-01 10:19:58.518 229152 DEBUG nova.objects.instance [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lazy-loading 'migration_context' on Instance uuid c46b48c6-f17b-46fd-909f-65cf07dab4e6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  1 05:19:58 np0005540826 nova_compute[229148]: 2025-12-01 10:19:58.531 229152 DEBUG nova.virt.libvirt.driver [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  1 05:19:58 np0005540826 nova_compute[229148]: 2025-12-01 10:19:58.531 229152 DEBUG nova.virt.libvirt.driver [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Ensure instance console log exists: /var/lib/nova/instances/c46b48c6-f17b-46fd-909f-65cf07dab4e6/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  1 05:19:58 np0005540826 nova_compute[229148]: 2025-12-01 10:19:58.532 229152 DEBUG oslo_concurrency.lockutils [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:19:58 np0005540826 nova_compute[229148]: 2025-12-01 10:19:58.532 229152 DEBUG oslo_concurrency.lockutils [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:19:58 np0005540826 nova_compute[229148]: 2025-12-01 10:19:58.533 229152 DEBUG oslo_concurrency.lockutils [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:19:58 np0005540826 nova_compute[229148]: 2025-12-01 10:19:58.863 229152 DEBUG nova.network.neutron [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Successfully created port: c9f60650-13a9-4fee-a03c-4878c9ea3a07 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec  1 05:19:59 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:19:59 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:19:59 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:19:59.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:19:59 np0005540826 nova_compute[229148]: 2025-12-01 10:19:59.415 229152 DEBUG nova.network.neutron [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Successfully updated port: c9f60650-13a9-4fee-a03c-4878c9ea3a07 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  1 05:19:59 np0005540826 nova_compute[229148]: 2025-12-01 10:19:59.430 229152 DEBUG oslo_concurrency.lockutils [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Acquiring lock "refresh_cache-c46b48c6-f17b-46fd-909f-65cf07dab4e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  1 05:19:59 np0005540826 nova_compute[229148]: 2025-12-01 10:19:59.431 229152 DEBUG oslo_concurrency.lockutils [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Acquired lock "refresh_cache-c46b48c6-f17b-46fd-909f-65cf07dab4e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  1 05:19:59 np0005540826 nova_compute[229148]: 2025-12-01 10:19:59.431 229152 DEBUG nova.network.neutron [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  1 05:19:59 np0005540826 nova_compute[229148]: 2025-12-01 10:19:59.458 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:19:59 np0005540826 nova_compute[229148]: 2025-12-01 10:19:59.504 229152 DEBUG nova.compute.manager [req-7f6f9c3e-b6d6-4021-a480-39873b1c8e4c req-db41b5c8-f080-4690-92a7-2308763bacfb dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Received event network-changed-c9f60650-13a9-4fee-a03c-4878c9ea3a07 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  1 05:19:59 np0005540826 nova_compute[229148]: 2025-12-01 10:19:59.505 229152 DEBUG nova.compute.manager [req-7f6f9c3e-b6d6-4021-a480-39873b1c8e4c req-db41b5c8-f080-4690-92a7-2308763bacfb dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Refreshing instance network info cache due to event network-changed-c9f60650-13a9-4fee-a03c-4878c9ea3a07. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  1 05:19:59 np0005540826 nova_compute[229148]: 2025-12-01 10:19:59.505 229152 DEBUG oslo_concurrency.lockutils [req-7f6f9c3e-b6d6-4021-a480-39873b1c8e4c req-db41b5c8-f080-4690-92a7-2308763bacfb dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Acquiring lock "refresh_cache-c46b48c6-f17b-46fd-909f-65cf07dab4e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  1 05:19:59 np0005540826 nova_compute[229148]: 2025-12-01 10:19:59.726 229152 DEBUG nova.network.neutron [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  1 05:20:00 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:20:00 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:20:00 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:20:00.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:20:00 np0005540826 nova_compute[229148]: 2025-12-01 10:20:00.370 229152 DEBUG nova.network.neutron [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Updating instance_info_cache with network_info: [{"id": "c9f60650-13a9-4fee-a03c-4878c9ea3a07", "address": "fa:16:3e:0c:ce:19", "network": {"id": "4e3600e5-3d5c-4f84-844a-9271184774dd", "bridge": "br-int", "label": "tempest-network-smoke--432110090", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9f60650-13", "ovs_interfaceid": "c9f60650-13a9-4fee-a03c-4878c9ea3a07", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  1 05:20:00 np0005540826 nova_compute[229148]: 2025-12-01 10:20:00.430 229152 DEBUG oslo_concurrency.lockutils [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Releasing lock "refresh_cache-c46b48c6-f17b-46fd-909f-65cf07dab4e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  1 05:20:00 np0005540826 nova_compute[229148]: 2025-12-01 10:20:00.430 229152 DEBUG nova.compute.manager [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Instance network_info: |[{"id": "c9f60650-13a9-4fee-a03c-4878c9ea3a07", "address": "fa:16:3e:0c:ce:19", "network": {"id": "4e3600e5-3d5c-4f84-844a-9271184774dd", "bridge": "br-int", "label": "tempest-network-smoke--432110090", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9f60650-13", "ovs_interfaceid": "c9f60650-13a9-4fee-a03c-4878c9ea3a07", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  1 05:20:00 np0005540826 nova_compute[229148]: 2025-12-01 10:20:00.430 229152 DEBUG oslo_concurrency.lockutils [req-7f6f9c3e-b6d6-4021-a480-39873b1c8e4c req-db41b5c8-f080-4690-92a7-2308763bacfb dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Acquired lock "refresh_cache-c46b48c6-f17b-46fd-909f-65cf07dab4e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  1 05:20:00 np0005540826 nova_compute[229148]: 2025-12-01 10:20:00.431 229152 DEBUG nova.network.neutron [req-7f6f9c3e-b6d6-4021-a480-39873b1c8e4c req-db41b5c8-f080-4690-92a7-2308763bacfb dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Refreshing network info cache for port c9f60650-13a9-4fee-a03c-4878c9ea3a07 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  1 05:20:00 np0005540826 nova_compute[229148]: 2025-12-01 10:20:00.433 229152 DEBUG nova.virt.libvirt.driver [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Start _get_guest_xml network_info=[{"id": "c9f60650-13a9-4fee-a03c-4878c9ea3a07", "address": "fa:16:3e:0c:ce:19", "network": {"id": "4e3600e5-3d5c-4f84-844a-9271184774dd", "bridge": "br-int", "label": "tempest-network-smoke--432110090", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9f60650-13", "ovs_interfaceid": "c9f60650-13a9-4fee-a03c-4878c9ea3a07", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-01T10:14:19Z,direct_url=<?>,disk_format='qcow2',id=8f75d6de-6ce0-44e1-b417-d0111424475b,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9a5734898a6345909986f17ddf57b27d',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-01T10:14:22Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'encryption_options': None, 'disk_bus': 'virtio', 'encryption_format': None, 'size': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'device_type': 'disk', 'image_id': '8f75d6de-6ce0-44e1-b417-d0111424475b'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  1 05:20:00 np0005540826 nova_compute[229148]: 2025-12-01 10:20:00.437 229152 WARNING nova.virt.libvirt.driver [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  1 05:20:00 np0005540826 nova_compute[229148]: 2025-12-01 10:20:00.440 229152 DEBUG nova.virt.libvirt.host [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  1 05:20:00 np0005540826 nova_compute[229148]: 2025-12-01 10:20:00.441 229152 DEBUG nova.virt.libvirt.host [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  1 05:20:00 np0005540826 nova_compute[229148]: 2025-12-01 10:20:00.443 229152 DEBUG nova.virt.libvirt.host [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  1 05:20:00 np0005540826 nova_compute[229148]: 2025-12-01 10:20:00.443 229152 DEBUG nova.virt.libvirt.host [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  1 05:20:00 np0005540826 nova_compute[229148]: 2025-12-01 10:20:00.444 229152 DEBUG nova.virt.libvirt.driver [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  1 05:20:00 np0005540826 nova_compute[229148]: 2025-12-01 10:20:00.444 229152 DEBUG nova.virt.hardware [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-01T10:14:17Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='2e731827-1896-49cd-b0cc-12903555d217',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-01T10:14:19Z,direct_url=<?>,disk_format='qcow2',id=8f75d6de-6ce0-44e1-b417-d0111424475b,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9a5734898a6345909986f17ddf57b27d',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-01T10:14:22Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  1 05:20:00 np0005540826 nova_compute[229148]: 2025-12-01 10:20:00.444 229152 DEBUG nova.virt.hardware [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  1 05:20:00 np0005540826 nova_compute[229148]: 2025-12-01 10:20:00.445 229152 DEBUG nova.virt.hardware [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  1 05:20:00 np0005540826 nova_compute[229148]: 2025-12-01 10:20:00.445 229152 DEBUG nova.virt.hardware [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  1 05:20:00 np0005540826 nova_compute[229148]: 2025-12-01 10:20:00.445 229152 DEBUG nova.virt.hardware [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  1 05:20:00 np0005540826 nova_compute[229148]: 2025-12-01 10:20:00.445 229152 DEBUG nova.virt.hardware [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  1 05:20:00 np0005540826 nova_compute[229148]: 2025-12-01 10:20:00.445 229152 DEBUG nova.virt.hardware [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  1 05:20:00 np0005540826 nova_compute[229148]: 2025-12-01 10:20:00.446 229152 DEBUG nova.virt.hardware [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  1 05:20:00 np0005540826 nova_compute[229148]: 2025-12-01 10:20:00.446 229152 DEBUG nova.virt.hardware [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  1 05:20:00 np0005540826 nova_compute[229148]: 2025-12-01 10:20:00.446 229152 DEBUG nova.virt.hardware [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  1 05:20:00 np0005540826 nova_compute[229148]: 2025-12-01 10:20:00.446 229152 DEBUG nova.virt.hardware [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  1 05:20:00 np0005540826 nova_compute[229148]: 2025-12-01 10:20:00.449 229152 DEBUG oslo_concurrency.processutils [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:20:00 np0005540826 ceph-mon[80026]: Health detail: HEALTH_WARN 1 OSD(s) experiencing slow operations in BlueStore
Dec  1 05:20:00 np0005540826 ceph-mon[80026]: [WRN] BLUESTORE_SLOW_OP_ALERT: 1 OSD(s) experiencing slow operations in BlueStore
Dec  1 05:20:00 np0005540826 ceph-mon[80026]:     osd.2 observed slow operation indications in BlueStore
Dec  1 05:20:00 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec  1 05:20:00 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2855581033' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Dec  1 05:20:00 np0005540826 nova_compute[229148]: 2025-12-01 10:20:00.878 229152 DEBUG oslo_concurrency.processutils [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.429s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:20:00 np0005540826 nova_compute[229148]: 2025-12-01 10:20:00.901 229152 DEBUG nova.storage.rbd_utils [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] rbd image c46b48c6-f17b-46fd-909f-65cf07dab4e6_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Dec  1 05:20:00 np0005540826 nova_compute[229148]: 2025-12-01 10:20:00.906 229152 DEBUG oslo_concurrency.processutils [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:20:01 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:20:01 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:20:01 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:20:01.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:20:01 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec  1 05:20:01 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2472683310' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Dec  1 05:20:01 np0005540826 nova_compute[229148]: 2025-12-01 10:20:01.358 229152 DEBUG oslo_concurrency.processutils [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.452s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:20:01 np0005540826 nova_compute[229148]: 2025-12-01 10:20:01.360 229152 DEBUG nova.virt.libvirt.vif [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-01T10:19:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1386308587',display_name='tempest-TestNetworkBasicOps-server-1386308587',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1386308587',id=6,image_ref='8f75d6de-6ce0-44e1-b417-d0111424475b',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNlsicanYXytlXTgYTmaMFboTco/f673p0Przv+d4K28ZijgatIkOSXNpjiz5VeubRvfxH8C0Wp0yqZF71ydwtKBTEW7G7vvokxNdm8j7TByXnmiUBYXpHHmHVf8HImEjA==',key_name='tempest-TestNetworkBasicOps-553113242',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9f6be4e572624210b91193c011607c08',ramdisk_id='',reservation_id='r-djnfhl6c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='8f75d6de-6ce0-44e1-b417-d0111424475b',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1248115384',owner_user_name='tempest-TestNetworkBasicOps-1248115384-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-01T10:19:57Z,user_data=None,user_id='5b56a238daf0445798410e51caada0ff',uuid=c46b48c6-f17b-46fd-909f-65cf07dab4e6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c9f60650-13a9-4fee-a03c-4878c9ea3a07", "address": "fa:16:3e:0c:ce:19", "network": {"id": "4e3600e5-3d5c-4f84-844a-9271184774dd", "bridge": "br-int", "label": "tempest-network-smoke--432110090", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9f60650-13", "ovs_interfaceid": "c9f60650-13a9-4fee-a03c-4878c9ea3a07", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  1 05:20:01 np0005540826 nova_compute[229148]: 2025-12-01 10:20:01.361 229152 DEBUG nova.network.os_vif_util [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Converting VIF {"id": "c9f60650-13a9-4fee-a03c-4878c9ea3a07", "address": "fa:16:3e:0c:ce:19", "network": {"id": "4e3600e5-3d5c-4f84-844a-9271184774dd", "bridge": "br-int", "label": "tempest-network-smoke--432110090", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9f60650-13", "ovs_interfaceid": "c9f60650-13a9-4fee-a03c-4878c9ea3a07", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  1 05:20:01 np0005540826 nova_compute[229148]: 2025-12-01 10:20:01.362 229152 DEBUG nova.network.os_vif_util [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0c:ce:19,bridge_name='br-int',has_traffic_filtering=True,id=c9f60650-13a9-4fee-a03c-4878c9ea3a07,network=Network(4e3600e5-3d5c-4f84-844a-9271184774dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc9f60650-13') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  1 05:20:01 np0005540826 nova_compute[229148]: 2025-12-01 10:20:01.363 229152 DEBUG nova.objects.instance [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lazy-loading 'pci_devices' on Instance uuid c46b48c6-f17b-46fd-909f-65cf07dab4e6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  1 05:20:01 np0005540826 nova_compute[229148]: 2025-12-01 10:20:01.385 229152 DEBUG nova.virt.libvirt.driver [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] End _get_guest_xml xml=<domain type="kvm">
Dec  1 05:20:01 np0005540826 nova_compute[229148]:  <uuid>c46b48c6-f17b-46fd-909f-65cf07dab4e6</uuid>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:  <name>instance-00000006</name>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:  <memory>131072</memory>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:  <vcpu>1</vcpu>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:  <metadata>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  1 05:20:01 np0005540826 nova_compute[229148]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:      <nova:name>tempest-TestNetworkBasicOps-server-1386308587</nova:name>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:      <nova:creationTime>2025-12-01 10:20:00</nova:creationTime>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:      <nova:flavor name="m1.nano">
Dec  1 05:20:01 np0005540826 nova_compute[229148]:        <nova:memory>128</nova:memory>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:        <nova:disk>1</nova:disk>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:        <nova:swap>0</nova:swap>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:        <nova:ephemeral>0</nova:ephemeral>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:        <nova:vcpus>1</nova:vcpus>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:      </nova:flavor>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:      <nova:owner>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:        <nova:user uuid="5b56a238daf0445798410e51caada0ff">tempest-TestNetworkBasicOps-1248115384-project-member</nova:user>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:        <nova:project uuid="9f6be4e572624210b91193c011607c08">tempest-TestNetworkBasicOps-1248115384</nova:project>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:      </nova:owner>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:      <nova:root type="image" uuid="8f75d6de-6ce0-44e1-b417-d0111424475b"/>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:      <nova:ports>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:        <nova:port uuid="c9f60650-13a9-4fee-a03c-4878c9ea3a07">
Dec  1 05:20:01 np0005540826 nova_compute[229148]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:        </nova:port>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:      </nova:ports>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:    </nova:instance>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:  </metadata>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:  <sysinfo type="smbios">
Dec  1 05:20:01 np0005540826 nova_compute[229148]:    <system>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:      <entry name="manufacturer">RDO</entry>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:      <entry name="product">OpenStack Compute</entry>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:      <entry name="serial">c46b48c6-f17b-46fd-909f-65cf07dab4e6</entry>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:      <entry name="uuid">c46b48c6-f17b-46fd-909f-65cf07dab4e6</entry>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:      <entry name="family">Virtual Machine</entry>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:    </system>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:  </sysinfo>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:  <os>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:    <boot dev="hd"/>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:    <smbios mode="sysinfo"/>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:  </os>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:  <features>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:    <acpi/>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:    <apic/>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:    <vmcoreinfo/>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:  </features>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:  <clock offset="utc">
Dec  1 05:20:01 np0005540826 nova_compute[229148]:    <timer name="pit" tickpolicy="delay"/>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:    <timer name="hpet" present="no"/>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:  </clock>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:  <cpu mode="host-model" match="exact">
Dec  1 05:20:01 np0005540826 nova_compute[229148]:    <topology sockets="1" cores="1" threads="1"/>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:  </cpu>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:  <devices>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:    <disk type="network" device="disk">
Dec  1 05:20:01 np0005540826 nova_compute[229148]:      <driver type="raw" cache="none"/>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:      <source protocol="rbd" name="vms/c46b48c6-f17b-46fd-909f-65cf07dab4e6_disk">
Dec  1 05:20:01 np0005540826 nova_compute[229148]:        <host name="192.168.122.100" port="6789"/>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:        <host name="192.168.122.102" port="6789"/>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:        <host name="192.168.122.101" port="6789"/>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:      </source>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:      <auth username="openstack">
Dec  1 05:20:01 np0005540826 nova_compute[229148]:        <secret type="ceph" uuid="365f19c2-81e5-5edd-b6b4-280555214d3a"/>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:      </auth>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:      <target dev="vda" bus="virtio"/>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:    </disk>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:    <disk type="network" device="cdrom">
Dec  1 05:20:01 np0005540826 nova_compute[229148]:      <driver type="raw" cache="none"/>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:      <source protocol="rbd" name="vms/c46b48c6-f17b-46fd-909f-65cf07dab4e6_disk.config">
Dec  1 05:20:01 np0005540826 nova_compute[229148]:        <host name="192.168.122.100" port="6789"/>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:        <host name="192.168.122.102" port="6789"/>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:        <host name="192.168.122.101" port="6789"/>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:      </source>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:      <auth username="openstack">
Dec  1 05:20:01 np0005540826 nova_compute[229148]:        <secret type="ceph" uuid="365f19c2-81e5-5edd-b6b4-280555214d3a"/>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:      </auth>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:      <target dev="sda" bus="sata"/>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:    </disk>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:    <interface type="ethernet">
Dec  1 05:20:01 np0005540826 nova_compute[229148]:      <mac address="fa:16:3e:0c:ce:19"/>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:      <model type="virtio"/>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:      <driver name="vhost" rx_queue_size="512"/>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:      <mtu size="1442"/>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:      <target dev="tapc9f60650-13"/>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:    </interface>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:    <serial type="pty">
Dec  1 05:20:01 np0005540826 nova_compute[229148]:      <log file="/var/lib/nova/instances/c46b48c6-f17b-46fd-909f-65cf07dab4e6/console.log" append="off"/>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:    </serial>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:    <video>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:      <model type="virtio"/>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:    </video>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:    <input type="tablet" bus="usb"/>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:    <rng model="virtio">
Dec  1 05:20:01 np0005540826 nova_compute[229148]:      <backend model="random">/dev/urandom</backend>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:    </rng>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root"/>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:    <controller type="usb" index="0"/>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:    <memballoon model="virtio">
Dec  1 05:20:01 np0005540826 nova_compute[229148]:      <stats period="10"/>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:    </memballoon>
Dec  1 05:20:01 np0005540826 nova_compute[229148]:  </devices>
Dec  1 05:20:01 np0005540826 nova_compute[229148]: </domain>
Dec  1 05:20:01 np0005540826 nova_compute[229148]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  1 05:20:01 np0005540826 nova_compute[229148]: 2025-12-01 10:20:01.386 229152 DEBUG nova.compute.manager [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Preparing to wait for external event network-vif-plugged-c9f60650-13a9-4fee-a03c-4878c9ea3a07 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  1 05:20:01 np0005540826 nova_compute[229148]: 2025-12-01 10:20:01.387 229152 DEBUG oslo_concurrency.lockutils [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Acquiring lock "c46b48c6-f17b-46fd-909f-65cf07dab4e6-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:20:01 np0005540826 nova_compute[229148]: 2025-12-01 10:20:01.387 229152 DEBUG oslo_concurrency.lockutils [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "c46b48c6-f17b-46fd-909f-65cf07dab4e6-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:20:01 np0005540826 nova_compute[229148]: 2025-12-01 10:20:01.387 229152 DEBUG oslo_concurrency.lockutils [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "c46b48c6-f17b-46fd-909f-65cf07dab4e6-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:20:01 np0005540826 nova_compute[229148]: 2025-12-01 10:20:01.388 229152 DEBUG nova.virt.libvirt.vif [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-01T10:19:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1386308587',display_name='tempest-TestNetworkBasicOps-server-1386308587',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1386308587',id=6,image_ref='8f75d6de-6ce0-44e1-b417-d0111424475b',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNlsicanYXytlXTgYTmaMFboTco/f673p0Przv+d4K28ZijgatIkOSXNpjiz5VeubRvfxH8C0Wp0yqZF71ydwtKBTEW7G7vvokxNdm8j7TByXnmiUBYXpHHmHVf8HImEjA==',key_name='tempest-TestNetworkBasicOps-553113242',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9f6be4e572624210b91193c011607c08',ramdisk_id='',reservation_id='r-djnfhl6c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='8f75d6de-6ce0-44e1-b417-d0111424475b',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1248115384',owner_user_name='tempest-TestNetworkBasicOps-1248115384-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-01T10:19:57Z,user_data=None,user_id='5b56a238daf0445798410e51caada0ff',uuid=c46b48c6-f17b-46fd-909f-65cf07dab4e6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c9f60650-13a9-4fee-a03c-4878c9ea3a07", "address": "fa:16:3e:0c:ce:19", "network": {"id": "4e3600e5-3d5c-4f84-844a-9271184774dd", "bridge": "br-int", "label": "tempest-network-smoke--432110090", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9f60650-13", "ovs_interfaceid": "c9f60650-13a9-4fee-a03c-4878c9ea3a07", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  1 05:20:01 np0005540826 nova_compute[229148]: 2025-12-01 10:20:01.388 229152 DEBUG nova.network.os_vif_util [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Converting VIF {"id": "c9f60650-13a9-4fee-a03c-4878c9ea3a07", "address": "fa:16:3e:0c:ce:19", "network": {"id": "4e3600e5-3d5c-4f84-844a-9271184774dd", "bridge": "br-int", "label": "tempest-network-smoke--432110090", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9f60650-13", "ovs_interfaceid": "c9f60650-13a9-4fee-a03c-4878c9ea3a07", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  1 05:20:01 np0005540826 nova_compute[229148]: 2025-12-01 10:20:01.389 229152 DEBUG nova.network.os_vif_util [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0c:ce:19,bridge_name='br-int',has_traffic_filtering=True,id=c9f60650-13a9-4fee-a03c-4878c9ea3a07,network=Network(4e3600e5-3d5c-4f84-844a-9271184774dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc9f60650-13') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  1 05:20:01 np0005540826 nova_compute[229148]: 2025-12-01 10:20:01.389 229152 DEBUG os_vif [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0c:ce:19,bridge_name='br-int',has_traffic_filtering=True,id=c9f60650-13a9-4fee-a03c-4878c9ea3a07,network=Network(4e3600e5-3d5c-4f84-844a-9271184774dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc9f60650-13') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  1 05:20:01 np0005540826 nova_compute[229148]: 2025-12-01 10:20:01.390 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:20:01 np0005540826 nova_compute[229148]: 2025-12-01 10:20:01.391 229152 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  1 05:20:01 np0005540826 nova_compute[229148]: 2025-12-01 10:20:01.391 229152 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  1 05:20:01 np0005540826 nova_compute[229148]: 2025-12-01 10:20:01.394 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:20:01 np0005540826 nova_compute[229148]: 2025-12-01 10:20:01.394 229152 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc9f60650-13, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  1 05:20:01 np0005540826 nova_compute[229148]: 2025-12-01 10:20:01.395 229152 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc9f60650-13, col_values=(('external_ids', {'iface-id': 'c9f60650-13a9-4fee-a03c-4878c9ea3a07', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:0c:ce:19', 'vm-uuid': 'c46b48c6-f17b-46fd-909f-65cf07dab4e6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  1 05:20:01 np0005540826 nova_compute[229148]: 2025-12-01 10:20:01.396 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:20:01 np0005540826 nova_compute[229148]: 2025-12-01 10:20:01.398 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  1 05:20:01 np0005540826 NetworkManager[48989]: <info>  [1764584401.3978] manager: (tapc9f60650-13): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/47)
Dec  1 05:20:01 np0005540826 nova_compute[229148]: 2025-12-01 10:20:01.403 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:20:01 np0005540826 nova_compute[229148]: 2025-12-01 10:20:01.404 229152 INFO os_vif [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0c:ce:19,bridge_name='br-int',has_traffic_filtering=True,id=c9f60650-13a9-4fee-a03c-4878c9ea3a07,network=Network(4e3600e5-3d5c-4f84-844a-9271184774dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc9f60650-13')#033[00m
Dec  1 05:20:01 np0005540826 nova_compute[229148]: 2025-12-01 10:20:01.453 229152 DEBUG nova.virt.libvirt.driver [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  1 05:20:01 np0005540826 nova_compute[229148]: 2025-12-01 10:20:01.453 229152 DEBUG nova.virt.libvirt.driver [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  1 05:20:01 np0005540826 nova_compute[229148]: 2025-12-01 10:20:01.453 229152 DEBUG nova.virt.libvirt.driver [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] No VIF found with MAC fa:16:3e:0c:ce:19, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  1 05:20:01 np0005540826 nova_compute[229148]: 2025-12-01 10:20:01.454 229152 INFO nova.virt.libvirt.driver [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Using config drive#033[00m
Dec  1 05:20:01 np0005540826 nova_compute[229148]: 2025-12-01 10:20:01.481 229152 DEBUG nova.storage.rbd_utils [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] rbd image c46b48c6-f17b-46fd-909f-65cf07dab4e6_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Dec  1 05:20:01 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:20:01 np0005540826 nova_compute[229148]: 2025-12-01 10:20:01.870 229152 DEBUG nova.network.neutron [req-7f6f9c3e-b6d6-4021-a480-39873b1c8e4c req-db41b5c8-f080-4690-92a7-2308763bacfb dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Updated VIF entry in instance network info cache for port c9f60650-13a9-4fee-a03c-4878c9ea3a07. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  1 05:20:01 np0005540826 nova_compute[229148]: 2025-12-01 10:20:01.870 229152 DEBUG nova.network.neutron [req-7f6f9c3e-b6d6-4021-a480-39873b1c8e4c req-db41b5c8-f080-4690-92a7-2308763bacfb dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Updating instance_info_cache with network_info: [{"id": "c9f60650-13a9-4fee-a03c-4878c9ea3a07", "address": "fa:16:3e:0c:ce:19", "network": {"id": "4e3600e5-3d5c-4f84-844a-9271184774dd", "bridge": "br-int", "label": "tempest-network-smoke--432110090", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9f60650-13", "ovs_interfaceid": "c9f60650-13a9-4fee-a03c-4878c9ea3a07", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  1 05:20:01 np0005540826 nova_compute[229148]: 2025-12-01 10:20:01.901 229152 DEBUG oslo_concurrency.lockutils [req-7f6f9c3e-b6d6-4021-a480-39873b1c8e4c req-db41b5c8-f080-4690-92a7-2308763bacfb dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Releasing lock "refresh_cache-c46b48c6-f17b-46fd-909f-65cf07dab4e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  1 05:20:01 np0005540826 nova_compute[229148]: 2025-12-01 10:20:01.997 229152 INFO nova.virt.libvirt.driver [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Creating config drive at /var/lib/nova/instances/c46b48c6-f17b-46fd-909f-65cf07dab4e6/disk.config#033[00m
Dec  1 05:20:02 np0005540826 nova_compute[229148]: 2025-12-01 10:20:02.003 229152 DEBUG oslo_concurrency.processutils [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c46b48c6-f17b-46fd-909f-65cf07dab4e6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpy9zujug9 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:20:02 np0005540826 nova_compute[229148]: 2025-12-01 10:20:02.134 229152 DEBUG oslo_concurrency.processutils [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c46b48c6-f17b-46fd-909f-65cf07dab4e6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpy9zujug9" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:20:02 np0005540826 nova_compute[229148]: 2025-12-01 10:20:02.174 229152 DEBUG nova.storage.rbd_utils [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] rbd image c46b48c6-f17b-46fd-909f-65cf07dab4e6_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Dec  1 05:20:02 np0005540826 nova_compute[229148]: 2025-12-01 10:20:02.178 229152 DEBUG oslo_concurrency.processutils [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/c46b48c6-f17b-46fd-909f-65cf07dab4e6/disk.config c46b48c6-f17b-46fd-909f-65cf07dab4e6_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:20:02 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:20:02 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:20:02 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:20:02.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:20:02 np0005540826 nova_compute[229148]: 2025-12-01 10:20:02.903 229152 DEBUG oslo_concurrency.processutils [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/c46b48c6-f17b-46fd-909f-65cf07dab4e6/disk.config c46b48c6-f17b-46fd-909f-65cf07dab4e6_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.724s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:20:02 np0005540826 nova_compute[229148]: 2025-12-01 10:20:02.904 229152 INFO nova.virt.libvirt.driver [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Deleting local config drive /var/lib/nova/instances/c46b48c6-f17b-46fd-909f-65cf07dab4e6/disk.config because it was imported into RBD.#033[00m
Dec  1 05:20:02 np0005540826 kernel: tapc9f60650-13: entered promiscuous mode
Dec  1 05:20:02 np0005540826 NetworkManager[48989]: <info>  [1764584402.9603] manager: (tapc9f60650-13): new Tun device (/org/freedesktop/NetworkManager/Devices/48)
Dec  1 05:20:02 np0005540826 ovn_controller[132309]: 2025-12-01T10:20:02Z|00065|binding|INFO|Claiming lport c9f60650-13a9-4fee-a03c-4878c9ea3a07 for this chassis.
Dec  1 05:20:02 np0005540826 ovn_controller[132309]: 2025-12-01T10:20:02Z|00066|binding|INFO|c9f60650-13a9-4fee-a03c-4878c9ea3a07: Claiming fa:16:3e:0c:ce:19 10.100.0.10
Dec  1 05:20:02 np0005540826 nova_compute[229148]: 2025-12-01 10:20:02.961 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:20:02 np0005540826 nova_compute[229148]: 2025-12-01 10:20:02.967 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:20:02 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:20:02.974 141685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0c:ce:19 10.100.0.10'], port_security=['fa:16:3e:0c:ce:19 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'c46b48c6-f17b-46fd-909f-65cf07dab4e6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4e3600e5-3d5c-4f84-844a-9271184774dd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9f6be4e572624210b91193c011607c08', 'neutron:revision_number': '2', 'neutron:security_group_ids': '5dbb3e99-00d5-4ffd-aeb7-01c272921789', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=52a2e477-8890-44bf-b68f-dad9b43df531, chassis=[<ovs.db.idl.Row object at 0x7fe07c6626d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe07c6626d0>], logical_port=c9f60650-13a9-4fee-a03c-4878c9ea3a07) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  1 05:20:02 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:20:02.975 141685 INFO neutron.agent.ovn.metadata.agent [-] Port c9f60650-13a9-4fee-a03c-4878c9ea3a07 in datapath 4e3600e5-3d5c-4f84-844a-9271184774dd bound to our chassis#033[00m
Dec  1 05:20:02 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:20:02.976 141685 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4e3600e5-3d5c-4f84-844a-9271184774dd#033[00m
Dec  1 05:20:02 np0005540826 systemd-machined[192474]: New machine qemu-4-instance-00000006.
Dec  1 05:20:02 np0005540826 systemd-udevd[238624]: Network interface NamePolicy= disabled on kernel command line.
Dec  1 05:20:02 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:20:02.992 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[9a939327-8466-4822-b2e5-63281a660b23]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:20:02 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:20:02.993 141685 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap4e3600e5-31 in ovnmeta-4e3600e5-3d5c-4f84-844a-9271184774dd namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec  1 05:20:02 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:20:02.995 233565 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap4e3600e5-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec  1 05:20:02 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:20:02.995 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[db00116f-ce74-4d7e-b8ae-22822f0a9017]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:20:02 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:20:02.997 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[455ab609-22f6-4421-b819-2e85988a78cd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:20:03 np0005540826 NetworkManager[48989]: <info>  [1764584403.0039] device (tapc9f60650-13): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  1 05:20:03 np0005540826 NetworkManager[48989]: <info>  [1764584403.0053] device (tapc9f60650-13): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  1 05:20:03 np0005540826 systemd[1]: Started Virtual Machine qemu-4-instance-00000006.
Dec  1 05:20:03 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:20:03.012 141797 DEBUG oslo.privsep.daemon [-] privsep: reply[58511b81-957e-4277-aaae-43ea55e329d3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:20:03 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:20:03 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:20:03 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:20:03.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:20:03 np0005540826 ovn_controller[132309]: 2025-12-01T10:20:03Z|00067|binding|INFO|Setting lport c9f60650-13a9-4fee-a03c-4878c9ea3a07 ovn-installed in OVS
Dec  1 05:20:03 np0005540826 ovn_controller[132309]: 2025-12-01T10:20:03Z|00068|binding|INFO|Setting lport c9f60650-13a9-4fee-a03c-4878c9ea3a07 up in Southbound
Dec  1 05:20:03 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:20:03.041 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[bd68ed36-e9b5-4bcf-a15c-50d5b66518e1]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:20:03 np0005540826 nova_compute[229148]: 2025-12-01 10:20:03.069 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:20:03 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:20:03.075 233643 DEBUG oslo.privsep.daemon [-] privsep: reply[21ff65c2-e3d2-4010-a528-2a7a367e9df3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:20:03 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:20:03.082 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[773fa5b0-49f9-4c5e-869b-47fcaf91f57a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:20:03 np0005540826 NetworkManager[48989]: <info>  [1764584403.0835] manager: (tap4e3600e5-30): new Veth device (/org/freedesktop/NetworkManager/Devices/49)
Dec  1 05:20:03 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:20:03.115 233643 DEBUG oslo.privsep.daemon [-] privsep: reply[c4eda253-855f-4885-acad-89678c990e10]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:20:03 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:20:03.118 233643 DEBUG oslo.privsep.daemon [-] privsep: reply[3c624c0c-5955-4ec7-96d3-adef02afd3a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:20:03 np0005540826 NetworkManager[48989]: <info>  [1764584403.1401] device (tap4e3600e5-30): carrier: link connected
Dec  1 05:20:03 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:20:03.145 233643 DEBUG oslo.privsep.daemon [-] privsep: reply[f6bfc0d7-a294-4ec3-a5a4-fa2622eb967c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:20:03 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:20:03.164 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[e82568ce-edde-4ffd-aeb7-c4e26e512783]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4e3600e5-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8c:e5:4d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 24], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 426566, 'reachable_time': 20288, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238657, 'error': None, 'target': 'ovnmeta-4e3600e5-3d5c-4f84-844a-9271184774dd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:20:03 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:20:03.182 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[b4272c91-e139-4db7-a568-617dcd6ccf2e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe8c:e54d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 426566, 'tstamp': 426566}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 238659, 'error': None, 'target': 'ovnmeta-4e3600e5-3d5c-4f84-844a-9271184774dd', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:20:03 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:20:03.205 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[90422d9c-122d-429b-aa9e-18f617a0f8c8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4e3600e5-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8c:e5:4d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 24], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 426566, 'reachable_time': 20288, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 238660, 'error': None, 'target': 'ovnmeta-4e3600e5-3d5c-4f84-844a-9271184774dd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:20:03 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:20:03.238 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[9b64c649-0586-480b-8551-bdd8f2615ab1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:20:03 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:20:03.295 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[bf38d293-32df-4342-9e79-a9469e51f60f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:20:03 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:20:03.297 141685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4e3600e5-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  1 05:20:03 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:20:03.297 141685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  1 05:20:03 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:20:03.297 141685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4e3600e5-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  1 05:20:03 np0005540826 nova_compute[229148]: 2025-12-01 10:20:03.299 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:20:03 np0005540826 kernel: tap4e3600e5-30: entered promiscuous mode
Dec  1 05:20:03 np0005540826 NetworkManager[48989]: <info>  [1764584403.3011] manager: (tap4e3600e5-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/50)
Dec  1 05:20:03 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:20:03.302 141685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4e3600e5-30, col_values=(('external_ids', {'iface-id': 'f9f66464-5468-467b-973f-59839c1856fe'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  1 05:20:03 np0005540826 nova_compute[229148]: 2025-12-01 10:20:03.303 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:20:03 np0005540826 nova_compute[229148]: 2025-12-01 10:20:03.305 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:20:03 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:20:03.306 141685 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/4e3600e5-3d5c-4f84-844a-9271184774dd.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/4e3600e5-3d5c-4f84-844a-9271184774dd.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec  1 05:20:03 np0005540826 ovn_controller[132309]: 2025-12-01T10:20:03Z|00069|binding|INFO|Releasing lport f9f66464-5468-467b-973f-59839c1856fe from this chassis (sb_readonly=0)
Dec  1 05:20:03 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:20:03.348 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[a89df31f-b8ee-4fff-8ed6-71811975440e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:20:03 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:20:03.350 141685 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec  1 05:20:03 np0005540826 ovn_metadata_agent[141680]: global
Dec  1 05:20:03 np0005540826 ovn_metadata_agent[141680]:    log         /dev/log local0 debug
Dec  1 05:20:03 np0005540826 ovn_metadata_agent[141680]:    log-tag     haproxy-metadata-proxy-4e3600e5-3d5c-4f84-844a-9271184774dd
Dec  1 05:20:03 np0005540826 ovn_metadata_agent[141680]:    user        root
Dec  1 05:20:03 np0005540826 ovn_metadata_agent[141680]:    group       root
Dec  1 05:20:03 np0005540826 ovn_metadata_agent[141680]:    maxconn     1024
Dec  1 05:20:03 np0005540826 ovn_metadata_agent[141680]:    pidfile     /var/lib/neutron/external/pids/4e3600e5-3d5c-4f84-844a-9271184774dd.pid.haproxy
Dec  1 05:20:03 np0005540826 ovn_metadata_agent[141680]:    daemon
Dec  1 05:20:03 np0005540826 ovn_metadata_agent[141680]: 
Dec  1 05:20:03 np0005540826 ovn_metadata_agent[141680]: defaults
Dec  1 05:20:03 np0005540826 ovn_metadata_agent[141680]:    log global
Dec  1 05:20:03 np0005540826 ovn_metadata_agent[141680]:    mode http
Dec  1 05:20:03 np0005540826 ovn_metadata_agent[141680]:    option httplog
Dec  1 05:20:03 np0005540826 ovn_metadata_agent[141680]:    option dontlognull
Dec  1 05:20:03 np0005540826 ovn_metadata_agent[141680]:    option http-server-close
Dec  1 05:20:03 np0005540826 ovn_metadata_agent[141680]:    option forwardfor
Dec  1 05:20:03 np0005540826 ovn_metadata_agent[141680]:    retries                 3
Dec  1 05:20:03 np0005540826 ovn_metadata_agent[141680]:    timeout http-request    30s
Dec  1 05:20:03 np0005540826 ovn_metadata_agent[141680]:    timeout connect         30s
Dec  1 05:20:03 np0005540826 ovn_metadata_agent[141680]:    timeout client          32s
Dec  1 05:20:03 np0005540826 ovn_metadata_agent[141680]:    timeout server          32s
Dec  1 05:20:03 np0005540826 ovn_metadata_agent[141680]:    timeout http-keep-alive 30s
Dec  1 05:20:03 np0005540826 ovn_metadata_agent[141680]: 
Dec  1 05:20:03 np0005540826 ovn_metadata_agent[141680]: 
Dec  1 05:20:03 np0005540826 ovn_metadata_agent[141680]: listen listener
Dec  1 05:20:03 np0005540826 ovn_metadata_agent[141680]:    bind 169.254.169.254:80
Dec  1 05:20:03 np0005540826 ovn_metadata_agent[141680]:    server metadata /var/lib/neutron/metadata_proxy
Dec  1 05:20:03 np0005540826 ovn_metadata_agent[141680]:    http-request add-header X-OVN-Network-ID 4e3600e5-3d5c-4f84-844a-9271184774dd
Dec  1 05:20:03 np0005540826 ovn_metadata_agent[141680]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec  1 05:20:03 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:20:03.351 141685 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-4e3600e5-3d5c-4f84-844a-9271184774dd', 'env', 'PROCESS_TAG=haproxy-4e3600e5-3d5c-4f84-844a-9271184774dd', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/4e3600e5-3d5c-4f84-844a-9271184774dd.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec  1 05:20:03 np0005540826 nova_compute[229148]: 2025-12-01 10:20:03.367 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:20:03 np0005540826 nova_compute[229148]: 2025-12-01 10:20:03.709 229152 DEBUG nova.virt.driver [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] Emitting event <LifecycleEvent: 1764584403.7091837, c46b48c6-f17b-46fd-909f-65cf07dab4e6 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  1 05:20:03 np0005540826 nova_compute[229148]: 2025-12-01 10:20:03.710 229152 INFO nova.compute.manager [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] VM Started (Lifecycle Event)#033[00m
Dec  1 05:20:03 np0005540826 nova_compute[229148]: 2025-12-01 10:20:03.731 229152 DEBUG nova.compute.manager [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  1 05:20:03 np0005540826 nova_compute[229148]: 2025-12-01 10:20:03.735 229152 DEBUG nova.virt.driver [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] Emitting event <LifecycleEvent: 1764584403.7125988, c46b48c6-f17b-46fd-909f-65cf07dab4e6 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  1 05:20:03 np0005540826 nova_compute[229148]: 2025-12-01 10:20:03.736 229152 INFO nova.compute.manager [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] VM Paused (Lifecycle Event)#033[00m
Dec  1 05:20:03 np0005540826 nova_compute[229148]: 2025-12-01 10:20:03.751 229152 DEBUG nova.compute.manager [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  1 05:20:03 np0005540826 nova_compute[229148]: 2025-12-01 10:20:03.755 229152 DEBUG nova.compute.manager [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  1 05:20:03 np0005540826 nova_compute[229148]: 2025-12-01 10:20:03.775 229152 INFO nova.compute.manager [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  1 05:20:03 np0005540826 podman[238733]: 2025-12-01 10:20:03.705150693 +0000 UTC m=+0.028617237 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  1 05:20:03 np0005540826 podman[238733]: 2025-12-01 10:20:03.856500572 +0000 UTC m=+0.179967096 container create f8747e53949ad8ccd95fc5a41e9368fcf6d3784e7d1e3c4849e5ec069359915b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4e3600e5-3d5c-4f84-844a-9271184774dd, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec  1 05:20:03 np0005540826 nova_compute[229148]: 2025-12-01 10:20:03.875 229152 DEBUG nova.compute.manager [req-024c5dc7-128b-4a08-9512-246b7edcc12c req-1eb255b0-aea0-4d22-a606-c7c0566176c9 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Received event network-vif-plugged-c9f60650-13a9-4fee-a03c-4878c9ea3a07 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  1 05:20:03 np0005540826 nova_compute[229148]: 2025-12-01 10:20:03.876 229152 DEBUG oslo_concurrency.lockutils [req-024c5dc7-128b-4a08-9512-246b7edcc12c req-1eb255b0-aea0-4d22-a606-c7c0566176c9 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Acquiring lock "c46b48c6-f17b-46fd-909f-65cf07dab4e6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:20:03 np0005540826 nova_compute[229148]: 2025-12-01 10:20:03.876 229152 DEBUG oslo_concurrency.lockutils [req-024c5dc7-128b-4a08-9512-246b7edcc12c req-1eb255b0-aea0-4d22-a606-c7c0566176c9 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Lock "c46b48c6-f17b-46fd-909f-65cf07dab4e6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:20:03 np0005540826 nova_compute[229148]: 2025-12-01 10:20:03.876 229152 DEBUG oslo_concurrency.lockutils [req-024c5dc7-128b-4a08-9512-246b7edcc12c req-1eb255b0-aea0-4d22-a606-c7c0566176c9 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Lock "c46b48c6-f17b-46fd-909f-65cf07dab4e6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:20:03 np0005540826 nova_compute[229148]: 2025-12-01 10:20:03.877 229152 DEBUG nova.compute.manager [req-024c5dc7-128b-4a08-9512-246b7edcc12c req-1eb255b0-aea0-4d22-a606-c7c0566176c9 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Processing event network-vif-plugged-c9f60650-13a9-4fee-a03c-4878c9ea3a07 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec  1 05:20:03 np0005540826 nova_compute[229148]: 2025-12-01 10:20:03.878 229152 DEBUG nova.compute.manager [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  1 05:20:03 np0005540826 nova_compute[229148]: 2025-12-01 10:20:03.881 229152 DEBUG nova.virt.driver [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] Emitting event <LifecycleEvent: 1764584403.8810227, c46b48c6-f17b-46fd-909f-65cf07dab4e6 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  1 05:20:03 np0005540826 nova_compute[229148]: 2025-12-01 10:20:03.881 229152 INFO nova.compute.manager [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] VM Resumed (Lifecycle Event)#033[00m
Dec  1 05:20:03 np0005540826 nova_compute[229148]: 2025-12-01 10:20:03.883 229152 DEBUG nova.virt.libvirt.driver [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  1 05:20:03 np0005540826 nova_compute[229148]: 2025-12-01 10:20:03.887 229152 INFO nova.virt.libvirt.driver [-] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Instance spawned successfully.#033[00m
Dec  1 05:20:03 np0005540826 nova_compute[229148]: 2025-12-01 10:20:03.887 229152 DEBUG nova.virt.libvirt.driver [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  1 05:20:03 np0005540826 nova_compute[229148]: 2025-12-01 10:20:03.908 229152 DEBUG nova.compute.manager [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  1 05:20:03 np0005540826 systemd[1]: Started libpod-conmon-f8747e53949ad8ccd95fc5a41e9368fcf6d3784e7d1e3c4849e5ec069359915b.scope.
Dec  1 05:20:03 np0005540826 nova_compute[229148]: 2025-12-01 10:20:03.913 229152 DEBUG nova.compute.manager [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  1 05:20:03 np0005540826 nova_compute[229148]: 2025-12-01 10:20:03.916 229152 DEBUG nova.virt.libvirt.driver [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  1 05:20:03 np0005540826 nova_compute[229148]: 2025-12-01 10:20:03.917 229152 DEBUG nova.virt.libvirt.driver [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  1 05:20:03 np0005540826 nova_compute[229148]: 2025-12-01 10:20:03.917 229152 DEBUG nova.virt.libvirt.driver [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  1 05:20:03 np0005540826 nova_compute[229148]: 2025-12-01 10:20:03.917 229152 DEBUG nova.virt.libvirt.driver [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  1 05:20:03 np0005540826 nova_compute[229148]: 2025-12-01 10:20:03.918 229152 DEBUG nova.virt.libvirt.driver [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  1 05:20:03 np0005540826 nova_compute[229148]: 2025-12-01 10:20:03.918 229152 DEBUG nova.virt.libvirt.driver [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  1 05:20:03 np0005540826 nova_compute[229148]: 2025-12-01 10:20:03.941 229152 INFO nova.compute.manager [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  1 05:20:03 np0005540826 systemd[1]: Started libcrun container.
Dec  1 05:20:03 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7708c844f0fb33f55d88e6d9cfa28c7cc086bd7f5d6a6e81922c9692b15632f0/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  1 05:20:03 np0005540826 podman[238733]: 2025-12-01 10:20:03.963316466 +0000 UTC m=+0.286783000 container init f8747e53949ad8ccd95fc5a41e9368fcf6d3784e7d1e3c4849e5ec069359915b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4e3600e5-3d5c-4f84-844a-9271184774dd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec  1 05:20:03 np0005540826 podman[238733]: 2025-12-01 10:20:03.96904869 +0000 UTC m=+0.292515214 container start f8747e53949ad8ccd95fc5a41e9368fcf6d3784e7d1e3c4849e5ec069359915b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4e3600e5-3d5c-4f84-844a-9271184774dd, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Dec  1 05:20:03 np0005540826 nova_compute[229148]: 2025-12-01 10:20:03.976 229152 INFO nova.compute.manager [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Took 6.19 seconds to spawn the instance on the hypervisor.#033[00m
Dec  1 05:20:03 np0005540826 nova_compute[229148]: 2025-12-01 10:20:03.977 229152 DEBUG nova.compute.manager [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  1 05:20:03 np0005540826 neutron-haproxy-ovnmeta-4e3600e5-3d5c-4f84-844a-9271184774dd[238749]: [NOTICE]   (238753) : New worker (238755) forked
Dec  1 05:20:03 np0005540826 neutron-haproxy-ovnmeta-4e3600e5-3d5c-4f84-844a-9271184774dd[238749]: [NOTICE]   (238753) : Loading success.
Dec  1 05:20:04 np0005540826 nova_compute[229148]: 2025-12-01 10:20:04.034 229152 INFO nova.compute.manager [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Took 7.26 seconds to build instance.#033[00m
Dec  1 05:20:04 np0005540826 nova_compute[229148]: 2025-12-01 10:20:04.050 229152 DEBUG oslo_concurrency.lockutils [None req-76200f85-e458-4e87-b769-a645e31d048c 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "c46b48c6-f17b-46fd-909f-65cf07dab4e6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.351s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:20:04 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:20:04 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:20:04 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:20:04.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:20:04 np0005540826 nova_compute[229148]: 2025-12-01 10:20:04.460 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:20:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:20:04.551 141685 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:20:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:20:04.553 141685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:20:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:20:04.553 141685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:20:05 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:20:05 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:20:05 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:20:05.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:20:05 np0005540826 nova_compute[229148]: 2025-12-01 10:20:05.982 229152 DEBUG nova.compute.manager [req-2aa425f9-c4ab-4b6e-9033-1b6661e93183 req-b28013c0-4a96-4c84-a601-67a6581e0c6a dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Received event network-vif-plugged-c9f60650-13a9-4fee-a03c-4878c9ea3a07 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  1 05:20:05 np0005540826 nova_compute[229148]: 2025-12-01 10:20:05.983 229152 DEBUG oslo_concurrency.lockutils [req-2aa425f9-c4ab-4b6e-9033-1b6661e93183 req-b28013c0-4a96-4c84-a601-67a6581e0c6a dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Acquiring lock "c46b48c6-f17b-46fd-909f-65cf07dab4e6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:20:05 np0005540826 nova_compute[229148]: 2025-12-01 10:20:05.983 229152 DEBUG oslo_concurrency.lockutils [req-2aa425f9-c4ab-4b6e-9033-1b6661e93183 req-b28013c0-4a96-4c84-a601-67a6581e0c6a dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Lock "c46b48c6-f17b-46fd-909f-65cf07dab4e6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:20:05 np0005540826 nova_compute[229148]: 2025-12-01 10:20:05.984 229152 DEBUG oslo_concurrency.lockutils [req-2aa425f9-c4ab-4b6e-9033-1b6661e93183 req-b28013c0-4a96-4c84-a601-67a6581e0c6a dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Lock "c46b48c6-f17b-46fd-909f-65cf07dab4e6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:20:05 np0005540826 nova_compute[229148]: 2025-12-01 10:20:05.984 229152 DEBUG nova.compute.manager [req-2aa425f9-c4ab-4b6e-9033-1b6661e93183 req-b28013c0-4a96-4c84-a601-67a6581e0c6a dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] No waiting events found dispatching network-vif-plugged-c9f60650-13a9-4fee-a03c-4878c9ea3a07 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  1 05:20:05 np0005540826 nova_compute[229148]: 2025-12-01 10:20:05.984 229152 WARNING nova.compute.manager [req-2aa425f9-c4ab-4b6e-9033-1b6661e93183 req-b28013c0-4a96-4c84-a601-67a6581e0c6a dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Received unexpected event network-vif-plugged-c9f60650-13a9-4fee-a03c-4878c9ea3a07 for instance with vm_state active and task_state None.#033[00m
Dec  1 05:20:06 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:20:06 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:20:06 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:20:06.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:20:06 np0005540826 nova_compute[229148]: 2025-12-01 10:20:06.397 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:20:06 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:20:07 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:20:07 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:20:07 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:20:07.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:20:07 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec  1 05:20:07 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3326335386' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec  1 05:20:07 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec  1 05:20:07 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3326335386' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec  1 05:20:07 np0005540826 nova_compute[229148]: 2025-12-01 10:20:07.127 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:20:07 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:20:07.127 141685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '36:10:da', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '4e:5c:35:98:90:37'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  1 05:20:07 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:20:07.128 141685 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  1 05:20:08 np0005540826 nova_compute[229148]: 2025-12-01 10:20:08.088 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:20:08 np0005540826 NetworkManager[48989]: <info>  [1764584408.0894] manager: (patch-br-int-to-provnet-da274a4a-a49c-4f01-b728-391696cd2672): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/51)
Dec  1 05:20:08 np0005540826 NetworkManager[48989]: <info>  [1764584408.0904] manager: (patch-provnet-da274a4a-a49c-4f01-b728-391696cd2672-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/52)
Dec  1 05:20:08 np0005540826 ovn_controller[132309]: 2025-12-01T10:20:08Z|00070|binding|INFO|Releasing lport f9f66464-5468-467b-973f-59839c1856fe from this chassis (sb_readonly=0)
Dec  1 05:20:08 np0005540826 nova_compute[229148]: 2025-12-01 10:20:08.123 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:20:08 np0005540826 ovn_controller[132309]: 2025-12-01T10:20:08Z|00071|binding|INFO|Releasing lport f9f66464-5468-467b-973f-59839c1856fe from this chassis (sb_readonly=0)
Dec  1 05:20:08 np0005540826 nova_compute[229148]: 2025-12-01 10:20:08.130 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:20:08 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:20:08 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:20:08 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:20:08.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:20:09 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:20:09 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:20:09 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:20:09.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:20:09 np0005540826 nova_compute[229148]: 2025-12-01 10:20:09.047 229152 DEBUG nova.compute.manager [req-c384fa58-4c75-4dcb-8510-89e7c2e5ee6a req-39c69f44-4cba-4cba-9b37-727e5edddf1f dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Received event network-changed-c9f60650-13a9-4fee-a03c-4878c9ea3a07 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  1 05:20:09 np0005540826 nova_compute[229148]: 2025-12-01 10:20:09.048 229152 DEBUG nova.compute.manager [req-c384fa58-4c75-4dcb-8510-89e7c2e5ee6a req-39c69f44-4cba-4cba-9b37-727e5edddf1f dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Refreshing instance network info cache due to event network-changed-c9f60650-13a9-4fee-a03c-4878c9ea3a07. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  1 05:20:09 np0005540826 nova_compute[229148]: 2025-12-01 10:20:09.049 229152 DEBUG oslo_concurrency.lockutils [req-c384fa58-4c75-4dcb-8510-89e7c2e5ee6a req-39c69f44-4cba-4cba-9b37-727e5edddf1f dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Acquiring lock "refresh_cache-c46b48c6-f17b-46fd-909f-65cf07dab4e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  1 05:20:09 np0005540826 nova_compute[229148]: 2025-12-01 10:20:09.049 229152 DEBUG oslo_concurrency.lockutils [req-c384fa58-4c75-4dcb-8510-89e7c2e5ee6a req-39c69f44-4cba-4cba-9b37-727e5edddf1f dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Acquired lock "refresh_cache-c46b48c6-f17b-46fd-909f-65cf07dab4e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  1 05:20:09 np0005540826 nova_compute[229148]: 2025-12-01 10:20:09.050 229152 DEBUG nova.network.neutron [req-c384fa58-4c75-4dcb-8510-89e7c2e5ee6a req-39c69f44-4cba-4cba-9b37-727e5edddf1f dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Refreshing network info cache for port c9f60650-13a9-4fee-a03c-4878c9ea3a07 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  1 05:20:09 np0005540826 nova_compute[229148]: 2025-12-01 10:20:09.463 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:20:10 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:20:10 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:20:10 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:20:10.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:20:10 np0005540826 nova_compute[229148]: 2025-12-01 10:20:10.890 229152 DEBUG nova.network.neutron [req-c384fa58-4c75-4dcb-8510-89e7c2e5ee6a req-39c69f44-4cba-4cba-9b37-727e5edddf1f dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Updated VIF entry in instance network info cache for port c9f60650-13a9-4fee-a03c-4878c9ea3a07. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  1 05:20:10 np0005540826 nova_compute[229148]: 2025-12-01 10:20:10.892 229152 DEBUG nova.network.neutron [req-c384fa58-4c75-4dcb-8510-89e7c2e5ee6a req-39c69f44-4cba-4cba-9b37-727e5edddf1f dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Updating instance_info_cache with network_info: [{"id": "c9f60650-13a9-4fee-a03c-4878c9ea3a07", "address": "fa:16:3e:0c:ce:19", "network": {"id": "4e3600e5-3d5c-4f84-844a-9271184774dd", "bridge": "br-int", "label": "tempest-network-smoke--432110090", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.243", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9f60650-13", "ovs_interfaceid": "c9f60650-13a9-4fee-a03c-4878c9ea3a07", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  1 05:20:10 np0005540826 nova_compute[229148]: 2025-12-01 10:20:10.913 229152 DEBUG oslo_concurrency.lockutils [req-c384fa58-4c75-4dcb-8510-89e7c2e5ee6a req-39c69f44-4cba-4cba-9b37-727e5edddf1f dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Releasing lock "refresh_cache-c46b48c6-f17b-46fd-909f-65cf07dab4e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  1 05:20:11 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:20:11 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:20:11 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:20:11.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:20:11 np0005540826 nova_compute[229148]: 2025-12-01 10:20:11.400 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:20:11 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:20:11 np0005540826 podman[238794]: 2025-12-01 10:20:11.99923105 +0000 UTC m=+0.072440275 container health_status 9ce59c2758d021c154e97f8a3178b73d8f43045c59b9ba6fd93369a760a49136 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec  1 05:20:12 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:20:12 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:20:12 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:20:12.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:20:13 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:20:13 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:20:13 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:20:13.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:20:14 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:20:14 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:20:14 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:20:14.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:20:14 np0005540826 nova_compute[229148]: 2025-12-01 10:20:14.465 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:20:15 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:20:15 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:20:15 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:20:15.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:20:15 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:20:15.130 141685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=b99910e3-15ec-4cc7-b887-f5229f22d165, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  1 05:20:16 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:20:16 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:20:16 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:20:16.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:20:16 np0005540826 nova_compute[229148]: 2025-12-01 10:20:16.402 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:20:16 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:20:17 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:20:17 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:20:17 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:20:17.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:20:17 np0005540826 ovn_controller[132309]: 2025-12-01T10:20:17Z|00010|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:0c:ce:19 10.100.0.10
Dec  1 05:20:17 np0005540826 ovn_controller[132309]: 2025-12-01T10:20:17Z|00011|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:0c:ce:19 10.100.0.10
Dec  1 05:20:18 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:20:18 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:20:18 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:20:18.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:20:19 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:20:19 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:20:19 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:20:19.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:20:19 np0005540826 nova_compute[229148]: 2025-12-01 10:20:19.468 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:20:20 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:20:20 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:20:20 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:20:20.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:20:21 np0005540826 podman[238820]: 2025-12-01 10:20:21.00327353 +0000 UTC m=+0.087626165 container health_status 6b06bbb7dde72e1297fe084ecc5611549b12415c8a2a7d771b9728c6924dbf55 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec  1 05:20:21 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:20:21 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:20:21 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:20:21.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:20:21 np0005540826 nova_compute[229148]: 2025-12-01 10:20:21.404 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:20:21 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:20:22 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:20:22 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:20:22 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:20:22.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:20:23 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:20:23 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:20:23 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:20:23.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:20:23 np0005540826 nova_compute[229148]: 2025-12-01 10:20:23.477 229152 INFO nova.compute.manager [None req-bd842777-6b20-4ad4-8154-37c9471b305e 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Get console output#033[00m
Dec  1 05:20:23 np0005540826 nova_compute[229148]: 2025-12-01 10:20:23.482 234904 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Dec  1 05:20:24 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:20:24 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:20:24 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:20:24.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:20:24 np0005540826 nova_compute[229148]: 2025-12-01 10:20:24.470 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:20:25 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:20:25 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:20:25 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:20:25.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:20:25 np0005540826 podman[238873]: 2025-12-01 10:20:25.997977058 +0000 UTC m=+0.084574409 container health_status 2f6a34a2eb1139886d5df897b4264e2aa17883bd121a8757ac91426f6d44356f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125)
Dec  1 05:20:26 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:20:26 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:20:26 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:20:26.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:20:26 np0005540826 nova_compute[229148]: 2025-12-01 10:20:26.407 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:20:26 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:20:26 np0005540826 nova_compute[229148]: 2025-12-01 10:20:26.875 229152 DEBUG oslo_concurrency.lockutils [None req-f3c01b3a-07e7-4b54-9c34-11b5a05c31aa 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Acquiring lock "interface-c46b48c6-f17b-46fd-909f-65cf07dab4e6-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:20:26 np0005540826 nova_compute[229148]: 2025-12-01 10:20:26.875 229152 DEBUG oslo_concurrency.lockutils [None req-f3c01b3a-07e7-4b54-9c34-11b5a05c31aa 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "interface-c46b48c6-f17b-46fd-909f-65cf07dab4e6-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:20:26 np0005540826 nova_compute[229148]: 2025-12-01 10:20:26.876 229152 DEBUG nova.objects.instance [None req-f3c01b3a-07e7-4b54-9c34-11b5a05c31aa 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lazy-loading 'flavor' on Instance uuid c46b48c6-f17b-46fd-909f-65cf07dab4e6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  1 05:20:27 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:20:27 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:20:27 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:20:27.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:20:27 np0005540826 nova_compute[229148]: 2025-12-01 10:20:27.899 229152 DEBUG nova.objects.instance [None req-f3c01b3a-07e7-4b54-9c34-11b5a05c31aa 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lazy-loading 'pci_requests' on Instance uuid c46b48c6-f17b-46fd-909f-65cf07dab4e6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  1 05:20:27 np0005540826 nova_compute[229148]: 2025-12-01 10:20:27.916 229152 DEBUG nova.network.neutron [None req-f3c01b3a-07e7-4b54-9c34-11b5a05c31aa 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  1 05:20:28 np0005540826 nova_compute[229148]: 2025-12-01 10:20:28.253 229152 DEBUG nova.policy [None req-f3c01b3a-07e7-4b54-9c34-11b5a05c31aa 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5b56a238daf0445798410e51caada0ff', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9f6be4e572624210b91193c011607c08', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  1 05:20:28 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:20:28 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:20:28 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:20:28.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:20:28 np0005540826 nova_compute[229148]: 2025-12-01 10:20:28.846 229152 DEBUG nova.network.neutron [None req-f3c01b3a-07e7-4b54-9c34-11b5a05c31aa 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Successfully created port: 61c72b1f-aea9-4ec0-80c5-03c2e87aad8f _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec  1 05:20:29 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:20:29 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:20:29 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:20:29.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:20:29 np0005540826 nova_compute[229148]: 2025-12-01 10:20:29.472 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:20:30 np0005540826 nova_compute[229148]: 2025-12-01 10:20:30.056 229152 DEBUG nova.network.neutron [None req-f3c01b3a-07e7-4b54-9c34-11b5a05c31aa 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Successfully updated port: 61c72b1f-aea9-4ec0-80c5-03c2e87aad8f _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  1 05:20:30 np0005540826 nova_compute[229148]: 2025-12-01 10:20:30.079 229152 DEBUG oslo_concurrency.lockutils [None req-f3c01b3a-07e7-4b54-9c34-11b5a05c31aa 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Acquiring lock "refresh_cache-c46b48c6-f17b-46fd-909f-65cf07dab4e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  1 05:20:30 np0005540826 nova_compute[229148]: 2025-12-01 10:20:30.080 229152 DEBUG oslo_concurrency.lockutils [None req-f3c01b3a-07e7-4b54-9c34-11b5a05c31aa 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Acquired lock "refresh_cache-c46b48c6-f17b-46fd-909f-65cf07dab4e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  1 05:20:30 np0005540826 nova_compute[229148]: 2025-12-01 10:20:30.080 229152 DEBUG nova.network.neutron [None req-f3c01b3a-07e7-4b54-9c34-11b5a05c31aa 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  1 05:20:30 np0005540826 nova_compute[229148]: 2025-12-01 10:20:30.231 229152 DEBUG nova.compute.manager [req-e19fe0d6-19f4-4158-ba7c-5cb13099fc47 req-fb40d4be-abd2-4b18-955f-fa34cf32f435 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Received event network-changed-61c72b1f-aea9-4ec0-80c5-03c2e87aad8f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  1 05:20:30 np0005540826 nova_compute[229148]: 2025-12-01 10:20:30.232 229152 DEBUG nova.compute.manager [req-e19fe0d6-19f4-4158-ba7c-5cb13099fc47 req-fb40d4be-abd2-4b18-955f-fa34cf32f435 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Refreshing instance network info cache due to event network-changed-61c72b1f-aea9-4ec0-80c5-03c2e87aad8f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  1 05:20:30 np0005540826 nova_compute[229148]: 2025-12-01 10:20:30.232 229152 DEBUG oslo_concurrency.lockutils [req-e19fe0d6-19f4-4158-ba7c-5cb13099fc47 req-fb40d4be-abd2-4b18-955f-fa34cf32f435 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Acquiring lock "refresh_cache-c46b48c6-f17b-46fd-909f-65cf07dab4e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  1 05:20:30 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:20:30 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:20:30 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:20:30.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:20:31 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:20:31 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:20:31 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:20:31.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:20:31 np0005540826 nova_compute[229148]: 2025-12-01 10:20:31.409 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:20:31 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:20:32 np0005540826 nova_compute[229148]: 2025-12-01 10:20:32.270 229152 DEBUG nova.network.neutron [None req-f3c01b3a-07e7-4b54-9c34-11b5a05c31aa 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Updating instance_info_cache with network_info: [{"id": "c9f60650-13a9-4fee-a03c-4878c9ea3a07", "address": "fa:16:3e:0c:ce:19", "network": {"id": "4e3600e5-3d5c-4f84-844a-9271184774dd", "bridge": "br-int", "label": "tempest-network-smoke--432110090", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.243", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9f60650-13", "ovs_interfaceid": "c9f60650-13a9-4fee-a03c-4878c9ea3a07", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "61c72b1f-aea9-4ec0-80c5-03c2e87aad8f", "address": "fa:16:3e:71:2f:0e", "network": {"id": "0e5b3de9-56f5-4f4d-87c1-c01596567748", "bridge": "br-int", "label": "tempest-network-smoke--1903173849", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61c72b1f-ae", "ovs_interfaceid": "61c72b1f-aea9-4ec0-80c5-03c2e87aad8f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  1 05:20:32 np0005540826 nova_compute[229148]: 2025-12-01 10:20:32.292 229152 DEBUG oslo_concurrency.lockutils [None req-f3c01b3a-07e7-4b54-9c34-11b5a05c31aa 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Releasing lock "refresh_cache-c46b48c6-f17b-46fd-909f-65cf07dab4e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  1 05:20:32 np0005540826 nova_compute[229148]: 2025-12-01 10:20:32.293 229152 DEBUG oslo_concurrency.lockutils [req-e19fe0d6-19f4-4158-ba7c-5cb13099fc47 req-fb40d4be-abd2-4b18-955f-fa34cf32f435 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Acquired lock "refresh_cache-c46b48c6-f17b-46fd-909f-65cf07dab4e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  1 05:20:32 np0005540826 nova_compute[229148]: 2025-12-01 10:20:32.293 229152 DEBUG nova.network.neutron [req-e19fe0d6-19f4-4158-ba7c-5cb13099fc47 req-fb40d4be-abd2-4b18-955f-fa34cf32f435 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Refreshing network info cache for port 61c72b1f-aea9-4ec0-80c5-03c2e87aad8f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  1 05:20:32 np0005540826 nova_compute[229148]: 2025-12-01 10:20:32.296 229152 DEBUG nova.virt.libvirt.vif [None req-f3c01b3a-07e7-4b54-9c34-11b5a05c31aa 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-01T10:19:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1386308587',display_name='tempest-TestNetworkBasicOps-server-1386308587',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1386308587',id=6,image_ref='8f75d6de-6ce0-44e1-b417-d0111424475b',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNlsicanYXytlXTgYTmaMFboTco/f673p0Przv+d4K28ZijgatIkOSXNpjiz5VeubRvfxH8C0Wp0yqZF71ydwtKBTEW7G7vvokxNdm8j7TByXnmiUBYXpHHmHVf8HImEjA==',key_name='tempest-TestNetworkBasicOps-553113242',keypairs=<?>,launch_index=0,launched_at=2025-12-01T10:20:03Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='9f6be4e572624210b91193c011607c08',ramdisk_id='',reservation_id='r-djnfhl6c',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='8f75d6de-6ce0-44e1-b417-d0111424475b',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1248115384',owner_user_name='tempest-TestNetworkBasicOps-1248115384-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-12-01T10:20:04Z,user_data=None,user_id='5b56a238daf0445798410e51caada0ff',uuid=c46b48c6-f17b-46fd-909f-65cf07dab4e6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "61c72b1f-aea9-4ec0-80c5-03c2e87aad8f", "address": "fa:16:3e:71:2f:0e", "network": {"id": "0e5b3de9-56f5-4f4d-87c1-c01596567748", "bridge": "br-int", "label": "tempest-network-smoke--1903173849", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61c72b1f-ae", "ovs_interfaceid": "61c72b1f-aea9-4ec0-80c5-03c2e87aad8f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  1 05:20:32 np0005540826 nova_compute[229148]: 2025-12-01 10:20:32.297 229152 DEBUG nova.network.os_vif_util [None req-f3c01b3a-07e7-4b54-9c34-11b5a05c31aa 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Converting VIF {"id": "61c72b1f-aea9-4ec0-80c5-03c2e87aad8f", "address": "fa:16:3e:71:2f:0e", "network": {"id": "0e5b3de9-56f5-4f4d-87c1-c01596567748", "bridge": "br-int", "label": "tempest-network-smoke--1903173849", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61c72b1f-ae", "ovs_interfaceid": "61c72b1f-aea9-4ec0-80c5-03c2e87aad8f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  1 05:20:32 np0005540826 nova_compute[229148]: 2025-12-01 10:20:32.297 229152 DEBUG nova.network.os_vif_util [None req-f3c01b3a-07e7-4b54-9c34-11b5a05c31aa 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:71:2f:0e,bridge_name='br-int',has_traffic_filtering=True,id=61c72b1f-aea9-4ec0-80c5-03c2e87aad8f,network=Network(0e5b3de9-56f5-4f4d-87c1-c01596567748),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap61c72b1f-ae') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  1 05:20:32 np0005540826 nova_compute[229148]: 2025-12-01 10:20:32.298 229152 DEBUG os_vif [None req-f3c01b3a-07e7-4b54-9c34-11b5a05c31aa 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:71:2f:0e,bridge_name='br-int',has_traffic_filtering=True,id=61c72b1f-aea9-4ec0-80c5-03c2e87aad8f,network=Network(0e5b3de9-56f5-4f4d-87c1-c01596567748),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap61c72b1f-ae') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  1 05:20:32 np0005540826 nova_compute[229148]: 2025-12-01 10:20:32.298 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:20:32 np0005540826 nova_compute[229148]: 2025-12-01 10:20:32.299 229152 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  1 05:20:32 np0005540826 nova_compute[229148]: 2025-12-01 10:20:32.299 229152 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  1 05:20:32 np0005540826 nova_compute[229148]: 2025-12-01 10:20:32.304 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:20:32 np0005540826 nova_compute[229148]: 2025-12-01 10:20:32.304 229152 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap61c72b1f-ae, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  1 05:20:32 np0005540826 nova_compute[229148]: 2025-12-01 10:20:32.305 229152 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap61c72b1f-ae, col_values=(('external_ids', {'iface-id': '61c72b1f-aea9-4ec0-80c5-03c2e87aad8f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:71:2f:0e', 'vm-uuid': 'c46b48c6-f17b-46fd-909f-65cf07dab4e6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  1 05:20:32 np0005540826 nova_compute[229148]: 2025-12-01 10:20:32.307 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:20:32 np0005540826 NetworkManager[48989]: <info>  [1764584432.3078] manager: (tap61c72b1f-ae): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/53)
Dec  1 05:20:32 np0005540826 nova_compute[229148]: 2025-12-01 10:20:32.309 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  1 05:20:32 np0005540826 nova_compute[229148]: 2025-12-01 10:20:32.313 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:20:32 np0005540826 nova_compute[229148]: 2025-12-01 10:20:32.314 229152 INFO os_vif [None req-f3c01b3a-07e7-4b54-9c34-11b5a05c31aa 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:71:2f:0e,bridge_name='br-int',has_traffic_filtering=True,id=61c72b1f-aea9-4ec0-80c5-03c2e87aad8f,network=Network(0e5b3de9-56f5-4f4d-87c1-c01596567748),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap61c72b1f-ae')#033[00m
Dec  1 05:20:32 np0005540826 nova_compute[229148]: 2025-12-01 10:20:32.315 229152 DEBUG nova.virt.libvirt.vif [None req-f3c01b3a-07e7-4b54-9c34-11b5a05c31aa 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-01T10:19:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1386308587',display_name='tempest-TestNetworkBasicOps-server-1386308587',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1386308587',id=6,image_ref='8f75d6de-6ce0-44e1-b417-d0111424475b',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNlsicanYXytlXTgYTmaMFboTco/f673p0Przv+d4K28ZijgatIkOSXNpjiz5VeubRvfxH8C0Wp0yqZF71ydwtKBTEW7G7vvokxNdm8j7TByXnmiUBYXpHHmHVf8HImEjA==',key_name='tempest-TestNetworkBasicOps-553113242',keypairs=<?>,launch_index=0,launched_at=2025-12-01T10:20:03Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='9f6be4e572624210b91193c011607c08',ramdisk_id='',reservation_id='r-djnfhl6c',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='8f75d6de-6ce0-44e1-b417-d0111424475b',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1248115384',owner_user_name='tempest-TestNetworkBasicOps-1248115384-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-12-01T10:20:04Z,user_data=None,user_id='5b56a238daf0445798410e51caada0ff',uuid=c46b48c6-f17b-46fd-909f-65cf07dab4e6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "61c72b1f-aea9-4ec0-80c5-03c2e87aad8f", "address": "fa:16:3e:71:2f:0e", "network": {"id": "0e5b3de9-56f5-4f4d-87c1-c01596567748", "bridge": "br-int", "label": "tempest-network-smoke--1903173849", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61c72b1f-ae", "ovs_interfaceid": "61c72b1f-aea9-4ec0-80c5-03c2e87aad8f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  1 05:20:32 np0005540826 nova_compute[229148]: 2025-12-01 10:20:32.315 229152 DEBUG nova.network.os_vif_util [None req-f3c01b3a-07e7-4b54-9c34-11b5a05c31aa 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Converting VIF {"id": "61c72b1f-aea9-4ec0-80c5-03c2e87aad8f", "address": "fa:16:3e:71:2f:0e", "network": {"id": "0e5b3de9-56f5-4f4d-87c1-c01596567748", "bridge": "br-int", "label": "tempest-network-smoke--1903173849", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61c72b1f-ae", "ovs_interfaceid": "61c72b1f-aea9-4ec0-80c5-03c2e87aad8f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  1 05:20:32 np0005540826 nova_compute[229148]: 2025-12-01 10:20:32.316 229152 DEBUG nova.network.os_vif_util [None req-f3c01b3a-07e7-4b54-9c34-11b5a05c31aa 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:71:2f:0e,bridge_name='br-int',has_traffic_filtering=True,id=61c72b1f-aea9-4ec0-80c5-03c2e87aad8f,network=Network(0e5b3de9-56f5-4f4d-87c1-c01596567748),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap61c72b1f-ae') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  1 05:20:32 np0005540826 nova_compute[229148]: 2025-12-01 10:20:32.319 229152 DEBUG nova.virt.libvirt.guest [None req-f3c01b3a-07e7-4b54-9c34-11b5a05c31aa 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] attach device xml: <interface type="ethernet">
Dec  1 05:20:32 np0005540826 nova_compute[229148]:  <mac address="fa:16:3e:71:2f:0e"/>
Dec  1 05:20:32 np0005540826 nova_compute[229148]:  <model type="virtio"/>
Dec  1 05:20:32 np0005540826 nova_compute[229148]:  <driver name="vhost" rx_queue_size="512"/>
Dec  1 05:20:32 np0005540826 nova_compute[229148]:  <mtu size="1442"/>
Dec  1 05:20:32 np0005540826 nova_compute[229148]:  <target dev="tap61c72b1f-ae"/>
Dec  1 05:20:32 np0005540826 nova_compute[229148]: </interface>
Dec  1 05:20:32 np0005540826 nova_compute[229148]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Dec  1 05:20:32 np0005540826 kernel: tap61c72b1f-ae: entered promiscuous mode
Dec  1 05:20:32 np0005540826 NetworkManager[48989]: <info>  [1764584432.3327] manager: (tap61c72b1f-ae): new Tun device (/org/freedesktop/NetworkManager/Devices/54)
Dec  1 05:20:32 np0005540826 ovn_controller[132309]: 2025-12-01T10:20:32Z|00072|binding|INFO|Claiming lport 61c72b1f-aea9-4ec0-80c5-03c2e87aad8f for this chassis.
Dec  1 05:20:32 np0005540826 ovn_controller[132309]: 2025-12-01T10:20:32Z|00073|binding|INFO|61c72b1f-aea9-4ec0-80c5-03c2e87aad8f: Claiming fa:16:3e:71:2f:0e 10.100.0.28
Dec  1 05:20:32 np0005540826 nova_compute[229148]: 2025-12-01 10:20:32.335 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:20:32 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:20:32.344 141685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:71:2f:0e 10.100.0.28'], port_security=['fa:16:3e:71:2f:0e 10.100.0.28'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.28/28', 'neutron:device_id': 'c46b48c6-f17b-46fd-909f-65cf07dab4e6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0e5b3de9-56f5-4f4d-87c1-c01596567748', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9f6be4e572624210b91193c011607c08', 'neutron:revision_number': '2', 'neutron:security_group_ids': '11501149-732d-4202-ad97-ece49baad0dd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=26afc505-4eba-4dd1-91d0-5142d49a2356, chassis=[<ovs.db.idl.Row object at 0x7fe07c6626d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe07c6626d0>], logical_port=61c72b1f-aea9-4ec0-80c5-03c2e87aad8f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  1 05:20:32 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:20:32.346 141685 INFO neutron.agent.ovn.metadata.agent [-] Port 61c72b1f-aea9-4ec0-80c5-03c2e87aad8f in datapath 0e5b3de9-56f5-4f4d-87c1-c01596567748 bound to our chassis#033[00m
Dec  1 05:20:32 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:20:32.347 141685 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0e5b3de9-56f5-4f4d-87c1-c01596567748#033[00m
Dec  1 05:20:32 np0005540826 systemd-udevd[238966]: Network interface NamePolicy= disabled on kernel command line.
Dec  1 05:20:32 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:20:32.362 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[8606d96e-4c40-498a-a19f-f34b3d92eb6e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:20:32 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:20:32.363 141685 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap0e5b3de9-51 in ovnmeta-0e5b3de9-56f5-4f4d-87c1-c01596567748 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec  1 05:20:32 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:20:32.366 233565 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap0e5b3de9-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec  1 05:20:32 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:20:32.366 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[9ecae563-f6e4-4b3b-b5fc-5c36041b738e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:20:32 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:20:32.370 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[4171bf68-9a0d-44d2-900a-ba541ce94c2a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:20:32 np0005540826 NetworkManager[48989]: <info>  [1764584432.3773] device (tap61c72b1f-ae): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  1 05:20:32 np0005540826 NetworkManager[48989]: <info>  [1764584432.3834] device (tap61c72b1f-ae): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  1 05:20:32 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:20:32.385 141797 DEBUG oslo.privsep.daemon [-] privsep: reply[ea63cd02-d596-4c84-93e3-3f0b9b7e9ad9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:20:32 np0005540826 nova_compute[229148]: 2025-12-01 10:20:32.390 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:20:32 np0005540826 ovn_controller[132309]: 2025-12-01T10:20:32Z|00074|binding|INFO|Setting lport 61c72b1f-aea9-4ec0-80c5-03c2e87aad8f ovn-installed in OVS
Dec  1 05:20:32 np0005540826 ovn_controller[132309]: 2025-12-01T10:20:32Z|00075|binding|INFO|Setting lport 61c72b1f-aea9-4ec0-80c5-03c2e87aad8f up in Southbound
Dec  1 05:20:32 np0005540826 nova_compute[229148]: 2025-12-01 10:20:32.392 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:20:32 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:20:32.401 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[0ac15a69-398a-4bdd-9708-7abf53d82de7]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:20:32 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:20:32 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:20:32 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:20:32.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:20:32 np0005540826 nova_compute[229148]: 2025-12-01 10:20:32.448 229152 DEBUG nova.virt.libvirt.driver [None req-f3c01b3a-07e7-4b54-9c34-11b5a05c31aa 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  1 05:20:32 np0005540826 nova_compute[229148]: 2025-12-01 10:20:32.448 229152 DEBUG nova.virt.libvirt.driver [None req-f3c01b3a-07e7-4b54-9c34-11b5a05c31aa 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  1 05:20:32 np0005540826 nova_compute[229148]: 2025-12-01 10:20:32.448 229152 DEBUG nova.virt.libvirt.driver [None req-f3c01b3a-07e7-4b54-9c34-11b5a05c31aa 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] No VIF found with MAC fa:16:3e:0c:ce:19, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  1 05:20:32 np0005540826 nova_compute[229148]: 2025-12-01 10:20:32.448 229152 DEBUG nova.virt.libvirt.driver [None req-f3c01b3a-07e7-4b54-9c34-11b5a05c31aa 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] No VIF found with MAC fa:16:3e:71:2f:0e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  1 05:20:32 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:20:32.457 233643 DEBUG oslo.privsep.daemon [-] privsep: reply[ba0f1ac4-0af6-4752-aad3-1da339cb042e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:20:32 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:20:32.463 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[8524bd40-4a73-4018-8f43-c3f6b2c5b7af]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:20:32 np0005540826 systemd-udevd[238971]: Network interface NamePolicy= disabled on kernel command line.
Dec  1 05:20:32 np0005540826 NetworkManager[48989]: <info>  [1764584432.4649] manager: (tap0e5b3de9-50): new Veth device (/org/freedesktop/NetworkManager/Devices/55)
Dec  1 05:20:32 np0005540826 nova_compute[229148]: 2025-12-01 10:20:32.471 229152 DEBUG nova.virt.libvirt.guest [None req-f3c01b3a-07e7-4b54-9c34-11b5a05c31aa 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  1 05:20:32 np0005540826 nova_compute[229148]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  1 05:20:32 np0005540826 nova_compute[229148]:  <nova:name>tempest-TestNetworkBasicOps-server-1386308587</nova:name>
Dec  1 05:20:32 np0005540826 nova_compute[229148]:  <nova:creationTime>2025-12-01 10:20:32</nova:creationTime>
Dec  1 05:20:32 np0005540826 nova_compute[229148]:  <nova:flavor name="m1.nano">
Dec  1 05:20:32 np0005540826 nova_compute[229148]:    <nova:memory>128</nova:memory>
Dec  1 05:20:32 np0005540826 nova_compute[229148]:    <nova:disk>1</nova:disk>
Dec  1 05:20:32 np0005540826 nova_compute[229148]:    <nova:swap>0</nova:swap>
Dec  1 05:20:32 np0005540826 nova_compute[229148]:    <nova:ephemeral>0</nova:ephemeral>
Dec  1 05:20:32 np0005540826 nova_compute[229148]:    <nova:vcpus>1</nova:vcpus>
Dec  1 05:20:32 np0005540826 nova_compute[229148]:  </nova:flavor>
Dec  1 05:20:32 np0005540826 nova_compute[229148]:  <nova:owner>
Dec  1 05:20:32 np0005540826 nova_compute[229148]:    <nova:user uuid="5b56a238daf0445798410e51caada0ff">tempest-TestNetworkBasicOps-1248115384-project-member</nova:user>
Dec  1 05:20:32 np0005540826 nova_compute[229148]:    <nova:project uuid="9f6be4e572624210b91193c011607c08">tempest-TestNetworkBasicOps-1248115384</nova:project>
Dec  1 05:20:32 np0005540826 nova_compute[229148]:  </nova:owner>
Dec  1 05:20:32 np0005540826 nova_compute[229148]:  <nova:root type="image" uuid="8f75d6de-6ce0-44e1-b417-d0111424475b"/>
Dec  1 05:20:32 np0005540826 nova_compute[229148]:  <nova:ports>
Dec  1 05:20:32 np0005540826 nova_compute[229148]:    <nova:port uuid="c9f60650-13a9-4fee-a03c-4878c9ea3a07">
Dec  1 05:20:32 np0005540826 nova_compute[229148]:      <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Dec  1 05:20:32 np0005540826 nova_compute[229148]:    </nova:port>
Dec  1 05:20:32 np0005540826 nova_compute[229148]:    <nova:port uuid="61c72b1f-aea9-4ec0-80c5-03c2e87aad8f">
Dec  1 05:20:32 np0005540826 nova_compute[229148]:      <nova:ip type="fixed" address="10.100.0.28" ipVersion="4"/>
Dec  1 05:20:32 np0005540826 nova_compute[229148]:    </nova:port>
Dec  1 05:20:32 np0005540826 nova_compute[229148]:  </nova:ports>
Dec  1 05:20:32 np0005540826 nova_compute[229148]: </nova:instance>
Dec  1 05:20:32 np0005540826 nova_compute[229148]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Dec  1 05:20:32 np0005540826 nova_compute[229148]: 2025-12-01 10:20:32.501 229152 DEBUG oslo_concurrency.lockutils [None req-f3c01b3a-07e7-4b54-9c34-11b5a05c31aa 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "interface-c46b48c6-f17b-46fd-909f-65cf07dab4e6-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 5.625s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:20:32 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:20:32.500 233643 DEBUG oslo.privsep.daemon [-] privsep: reply[1ad658c4-1ed0-48dd-bc9d-9ed89f8346d0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:20:32 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:20:32.504 233643 DEBUG oslo.privsep.daemon [-] privsep: reply[6418f56d-47c2-4f81-a59c-17ddfe15fe58]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:20:32 np0005540826 NetworkManager[48989]: <info>  [1764584432.5262] device (tap0e5b3de9-50): carrier: link connected
Dec  1 05:20:32 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:20:32.531 233643 DEBUG oslo.privsep.daemon [-] privsep: reply[e5eb31db-0f1c-4d41-bf62-b3df09009f5f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:20:32 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:20:32.554 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[14a4fea5-43fa-44be-865e-e69f9a3c5105]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0e5b3de9-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:14:1d:c4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 26], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 429504, 'reachable_time': 15604, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238998, 'error': None, 'target': 'ovnmeta-0e5b3de9-56f5-4f4d-87c1-c01596567748', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:20:32 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:20:32.575 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[022b2705-2598-4041-b4e2-a515ecba5214]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe14:1dc4'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 429504, 'tstamp': 429504}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 238999, 'error': None, 'target': 'ovnmeta-0e5b3de9-56f5-4f4d-87c1-c01596567748', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:20:32 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:20:32.599 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[11634e17-4dbe-4a92-860f-7ab985e51132]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0e5b3de9-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:14:1d:c4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 26], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 429504, 'reachable_time': 15604, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 239006, 'error': None, 'target': 'ovnmeta-0e5b3de9-56f5-4f4d-87c1-c01596567748', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:20:32 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:20:32.639 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[4b402eb7-806f-4ab7-9ff4-bd1559c6183a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:20:32 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:20:32.707 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[6f2e7d10-f168-4a38-9f28-f19f8611075e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:20:32 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:20:32.709 141685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0e5b3de9-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  1 05:20:32 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:20:32.709 141685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  1 05:20:32 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:20:32.709 141685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0e5b3de9-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  1 05:20:32 np0005540826 nova_compute[229148]: 2025-12-01 10:20:32.711 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:20:32 np0005540826 kernel: tap0e5b3de9-50: entered promiscuous mode
Dec  1 05:20:32 np0005540826 NetworkManager[48989]: <info>  [1764584432.7124] manager: (tap0e5b3de9-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/56)
Dec  1 05:20:32 np0005540826 nova_compute[229148]: 2025-12-01 10:20:32.713 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:20:32 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:20:32.714 141685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0e5b3de9-50, col_values=(('external_ids', {'iface-id': '9d1b1966-b85d-4c48-9bc3-59ebb92f0fa7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  1 05:20:32 np0005540826 nova_compute[229148]: 2025-12-01 10:20:32.715 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:20:32 np0005540826 ovn_controller[132309]: 2025-12-01T10:20:32Z|00076|binding|INFO|Releasing lport 9d1b1966-b85d-4c48-9bc3-59ebb92f0fa7 from this chassis (sb_readonly=0)
Dec  1 05:20:32 np0005540826 nova_compute[229148]: 2025-12-01 10:20:32.716 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:20:32 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:20:32.717 141685 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0e5b3de9-56f5-4f4d-87c1-c01596567748.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0e5b3de9-56f5-4f4d-87c1-c01596567748.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec  1 05:20:32 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:20:32.718 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[9a22dbee-eb99-4237-9152-94d8b70dea8e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:20:32 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:20:32.718 141685 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec  1 05:20:32 np0005540826 ovn_metadata_agent[141680]: global
Dec  1 05:20:32 np0005540826 ovn_metadata_agent[141680]:    log         /dev/log local0 debug
Dec  1 05:20:32 np0005540826 ovn_metadata_agent[141680]:    log-tag     haproxy-metadata-proxy-0e5b3de9-56f5-4f4d-87c1-c01596567748
Dec  1 05:20:32 np0005540826 ovn_metadata_agent[141680]:    user        root
Dec  1 05:20:32 np0005540826 ovn_metadata_agent[141680]:    group       root
Dec  1 05:20:32 np0005540826 ovn_metadata_agent[141680]:    maxconn     1024
Dec  1 05:20:32 np0005540826 ovn_metadata_agent[141680]:    pidfile     /var/lib/neutron/external/pids/0e5b3de9-56f5-4f4d-87c1-c01596567748.pid.haproxy
Dec  1 05:20:32 np0005540826 ovn_metadata_agent[141680]:    daemon
Dec  1 05:20:32 np0005540826 ovn_metadata_agent[141680]: 
Dec  1 05:20:32 np0005540826 ovn_metadata_agent[141680]: defaults
Dec  1 05:20:32 np0005540826 ovn_metadata_agent[141680]:    log global
Dec  1 05:20:32 np0005540826 ovn_metadata_agent[141680]:    mode http
Dec  1 05:20:32 np0005540826 ovn_metadata_agent[141680]:    option httplog
Dec  1 05:20:32 np0005540826 ovn_metadata_agent[141680]:    option dontlognull
Dec  1 05:20:32 np0005540826 ovn_metadata_agent[141680]:    option http-server-close
Dec  1 05:20:32 np0005540826 ovn_metadata_agent[141680]:    option forwardfor
Dec  1 05:20:32 np0005540826 ovn_metadata_agent[141680]:    retries                 3
Dec  1 05:20:32 np0005540826 ovn_metadata_agent[141680]:    timeout http-request    30s
Dec  1 05:20:32 np0005540826 ovn_metadata_agent[141680]:    timeout connect         30s
Dec  1 05:20:32 np0005540826 ovn_metadata_agent[141680]:    timeout client          32s
Dec  1 05:20:32 np0005540826 ovn_metadata_agent[141680]:    timeout server          32s
Dec  1 05:20:32 np0005540826 ovn_metadata_agent[141680]:    timeout http-keep-alive 30s
Dec  1 05:20:32 np0005540826 ovn_metadata_agent[141680]: 
Dec  1 05:20:32 np0005540826 ovn_metadata_agent[141680]: 
Dec  1 05:20:32 np0005540826 ovn_metadata_agent[141680]: listen listener
Dec  1 05:20:32 np0005540826 ovn_metadata_agent[141680]:    bind 169.254.169.254:80
Dec  1 05:20:32 np0005540826 ovn_metadata_agent[141680]:    server metadata /var/lib/neutron/metadata_proxy
Dec  1 05:20:32 np0005540826 ovn_metadata_agent[141680]:    http-request add-header X-OVN-Network-ID 0e5b3de9-56f5-4f4d-87c1-c01596567748
Dec  1 05:20:32 np0005540826 ovn_metadata_agent[141680]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec  1 05:20:32 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:20:32.719 141685 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-0e5b3de9-56f5-4f4d-87c1-c01596567748', 'env', 'PROCESS_TAG=haproxy-0e5b3de9-56f5-4f4d-87c1-c01596567748', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/0e5b3de9-56f5-4f4d-87c1-c01596567748.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec  1 05:20:32 np0005540826 nova_compute[229148]: 2025-12-01 10:20:32.729 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:20:32 np0005540826 nova_compute[229148]: 2025-12-01 10:20:32.993 229152 DEBUG nova.compute.manager [req-87f87f33-6842-4015-85e7-ba80c09eab33 req-303053ad-5705-4a72-bb59-9325ee72277d dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Received event network-vif-plugged-61c72b1f-aea9-4ec0-80c5-03c2e87aad8f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  1 05:20:32 np0005540826 nova_compute[229148]: 2025-12-01 10:20:32.994 229152 DEBUG oslo_concurrency.lockutils [req-87f87f33-6842-4015-85e7-ba80c09eab33 req-303053ad-5705-4a72-bb59-9325ee72277d dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Acquiring lock "c46b48c6-f17b-46fd-909f-65cf07dab4e6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:20:32 np0005540826 nova_compute[229148]: 2025-12-01 10:20:32.994 229152 DEBUG oslo_concurrency.lockutils [req-87f87f33-6842-4015-85e7-ba80c09eab33 req-303053ad-5705-4a72-bb59-9325ee72277d dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Lock "c46b48c6-f17b-46fd-909f-65cf07dab4e6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:20:32 np0005540826 nova_compute[229148]: 2025-12-01 10:20:32.994 229152 DEBUG oslo_concurrency.lockutils [req-87f87f33-6842-4015-85e7-ba80c09eab33 req-303053ad-5705-4a72-bb59-9325ee72277d dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Lock "c46b48c6-f17b-46fd-909f-65cf07dab4e6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:20:32 np0005540826 nova_compute[229148]: 2025-12-01 10:20:32.994 229152 DEBUG nova.compute.manager [req-87f87f33-6842-4015-85e7-ba80c09eab33 req-303053ad-5705-4a72-bb59-9325ee72277d dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] No waiting events found dispatching network-vif-plugged-61c72b1f-aea9-4ec0-80c5-03c2e87aad8f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  1 05:20:32 np0005540826 nova_compute[229148]: 2025-12-01 10:20:32.994 229152 WARNING nova.compute.manager [req-87f87f33-6842-4015-85e7-ba80c09eab33 req-303053ad-5705-4a72-bb59-9325ee72277d dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Received unexpected event network-vif-plugged-61c72b1f-aea9-4ec0-80c5-03c2e87aad8f for instance with vm_state active and task_state None.#033[00m
Dec  1 05:20:33 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:20:33 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:20:33 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:20:33.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:20:33 np0005540826 podman[239044]: 2025-12-01 10:20:33.090522254 +0000 UTC m=+0.056902815 container create 766caed8a99adb8dce5fa589b713ffd1014a25ca81157b88500f2c485f16f61e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0e5b3de9-56f5-4f4d-87c1-c01596567748, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Dec  1 05:20:33 np0005540826 systemd[1]: Started libpod-conmon-766caed8a99adb8dce5fa589b713ffd1014a25ca81157b88500f2c485f16f61e.scope.
Dec  1 05:20:33 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 05:20:33 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:20:33 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:20:33 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 05:20:33 np0005540826 podman[239044]: 2025-12-01 10:20:33.067605981 +0000 UTC m=+0.033986562 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  1 05:20:33 np0005540826 systemd[1]: Started libcrun container.
Dec  1 05:20:33 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d058fcadc33cee113e06b74707e1ac5bf27f3bb3aab2826468c429150cc7fff9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  1 05:20:33 np0005540826 podman[239044]: 2025-12-01 10:20:33.181469891 +0000 UTC m=+0.147850472 container init 766caed8a99adb8dce5fa589b713ffd1014a25ca81157b88500f2c485f16f61e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0e5b3de9-56f5-4f4d-87c1-c01596567748, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 05:20:33 np0005540826 podman[239044]: 2025-12-01 10:20:33.186788234 +0000 UTC m=+0.153168795 container start 766caed8a99adb8dce5fa589b713ffd1014a25ca81157b88500f2c485f16f61e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0e5b3de9-56f5-4f4d-87c1-c01596567748, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 05:20:33 np0005540826 neutron-haproxy-ovnmeta-0e5b3de9-56f5-4f4d-87c1-c01596567748[239059]: [NOTICE]   (239063) : New worker (239065) forked
Dec  1 05:20:33 np0005540826 neutron-haproxy-ovnmeta-0e5b3de9-56f5-4f4d-87c1-c01596567748[239059]: [NOTICE]   (239063) : Loading success.
Dec  1 05:20:33 np0005540826 ovn_controller[132309]: 2025-12-01T10:20:33Z|00012|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:71:2f:0e 10.100.0.28
Dec  1 05:20:33 np0005540826 ovn_controller[132309]: 2025-12-01T10:20:33Z|00013|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:71:2f:0e 10.100.0.28
Dec  1 05:20:33 np0005540826 nova_compute[229148]: 2025-12-01 10:20:33.954 229152 DEBUG nova.network.neutron [req-e19fe0d6-19f4-4158-ba7c-5cb13099fc47 req-fb40d4be-abd2-4b18-955f-fa34cf32f435 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Updated VIF entry in instance network info cache for port 61c72b1f-aea9-4ec0-80c5-03c2e87aad8f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  1 05:20:33 np0005540826 nova_compute[229148]: 2025-12-01 10:20:33.955 229152 DEBUG nova.network.neutron [req-e19fe0d6-19f4-4158-ba7c-5cb13099fc47 req-fb40d4be-abd2-4b18-955f-fa34cf32f435 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Updating instance_info_cache with network_info: [{"id": "c9f60650-13a9-4fee-a03c-4878c9ea3a07", "address": "fa:16:3e:0c:ce:19", "network": {"id": "4e3600e5-3d5c-4f84-844a-9271184774dd", "bridge": "br-int", "label": "tempest-network-smoke--432110090", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.243", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9f60650-13", "ovs_interfaceid": "c9f60650-13a9-4fee-a03c-4878c9ea3a07", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "61c72b1f-aea9-4ec0-80c5-03c2e87aad8f", "address": "fa:16:3e:71:2f:0e", "network": {"id": "0e5b3de9-56f5-4f4d-87c1-c01596567748", "bridge": "br-int", "label": "tempest-network-smoke--1903173849", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61c72b1f-ae", "ovs_interfaceid": "61c72b1f-aea9-4ec0-80c5-03c2e87aad8f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  1 05:20:33 np0005540826 nova_compute[229148]: 2025-12-01 10:20:33.975 229152 DEBUG oslo_concurrency.lockutils [req-e19fe0d6-19f4-4158-ba7c-5cb13099fc47 req-fb40d4be-abd2-4b18-955f-fa34cf32f435 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Releasing lock "refresh_cache-c46b48c6-f17b-46fd-909f-65cf07dab4e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  1 05:20:34 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:20:34 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:20:34 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:20:34.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:20:34 np0005540826 nova_compute[229148]: 2025-12-01 10:20:34.474 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:20:35 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:20:35 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:20:35 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:20:35.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:20:35 np0005540826 nova_compute[229148]: 2025-12-01 10:20:35.169 229152 DEBUG nova.compute.manager [req-2b855587-f549-43e0-a7e6-ebdbddda3514 req-6636f818-2056-4ac5-9bb6-58a3d49217b6 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Received event network-vif-plugged-61c72b1f-aea9-4ec0-80c5-03c2e87aad8f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  1 05:20:35 np0005540826 nova_compute[229148]: 2025-12-01 10:20:35.169 229152 DEBUG oslo_concurrency.lockutils [req-2b855587-f549-43e0-a7e6-ebdbddda3514 req-6636f818-2056-4ac5-9bb6-58a3d49217b6 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Acquiring lock "c46b48c6-f17b-46fd-909f-65cf07dab4e6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:20:35 np0005540826 nova_compute[229148]: 2025-12-01 10:20:35.169 229152 DEBUG oslo_concurrency.lockutils [req-2b855587-f549-43e0-a7e6-ebdbddda3514 req-6636f818-2056-4ac5-9bb6-58a3d49217b6 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Lock "c46b48c6-f17b-46fd-909f-65cf07dab4e6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:20:35 np0005540826 nova_compute[229148]: 2025-12-01 10:20:35.170 229152 DEBUG oslo_concurrency.lockutils [req-2b855587-f549-43e0-a7e6-ebdbddda3514 req-6636f818-2056-4ac5-9bb6-58a3d49217b6 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Lock "c46b48c6-f17b-46fd-909f-65cf07dab4e6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:20:35 np0005540826 nova_compute[229148]: 2025-12-01 10:20:35.170 229152 DEBUG nova.compute.manager [req-2b855587-f549-43e0-a7e6-ebdbddda3514 req-6636f818-2056-4ac5-9bb6-58a3d49217b6 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] No waiting events found dispatching network-vif-plugged-61c72b1f-aea9-4ec0-80c5-03c2e87aad8f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  1 05:20:35 np0005540826 nova_compute[229148]: 2025-12-01 10:20:35.170 229152 WARNING nova.compute.manager [req-2b855587-f549-43e0-a7e6-ebdbddda3514 req-6636f818-2056-4ac5-9bb6-58a3d49217b6 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Received unexpected event network-vif-plugged-61c72b1f-aea9-4ec0-80c5-03c2e87aad8f for instance with vm_state active and task_state None.#033[00m
Dec  1 05:20:36 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:20:36 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:20:36 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:20:36.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:20:36 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:20:37 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:20:37 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:20:37 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:20:37.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:20:37 np0005540826 nova_compute[229148]: 2025-12-01 10:20:37.308 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:20:38 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:20:38 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:20:38 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:20:38.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:20:39 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:20:39 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:20:39 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:20:39.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:20:39 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:20:39 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:20:39 np0005540826 nova_compute[229148]: 2025-12-01 10:20:39.513 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:20:40 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:20:40 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:20:40 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:20:40.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:20:41 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:20:41 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:20:41 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:20:41.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:20:41 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:20:42 np0005540826 nova_compute[229148]: 2025-12-01 10:20:42.311 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:20:42 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:20:42 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:20:42 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:20:42.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:20:42 np0005540826 podman[239104]: 2025-12-01 10:20:42.982132551 +0000 UTC m=+0.066233192 container health_status 9ce59c2758d021c154e97f8a3178b73d8f43045c59b9ba6fd93369a760a49136 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec  1 05:20:43 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:20:43 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:20:43 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:20:43.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:20:44 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:20:44 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:20:44 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:20:44.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:20:44 np0005540826 nova_compute[229148]: 2025-12-01 10:20:44.556 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:20:45 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:20:45 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:20:45 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:20:45.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:20:46 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:20:46 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:20:46 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:20:46.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:20:46 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:20:47 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:20:47 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:20:47 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:20:47.101 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:20:47 np0005540826 nova_compute[229148]: 2025-12-01 10:20:47.314 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:20:48 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:20:48 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:20:48 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:20:48.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:20:49 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:20:49 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:20:49 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:20:49.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:20:49 np0005540826 nova_compute[229148]: 2025-12-01 10:20:49.558 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:20:50 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:20:50 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:20:50 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:20:50.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:20:51 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:20:51 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:20:51 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:20:51.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:20:51 np0005540826 nova_compute[229148]: 2025-12-01 10:20:51.176 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:20:51 np0005540826 nova_compute[229148]: 2025-12-01 10:20:51.176 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  1 05:20:51 np0005540826 nova_compute[229148]: 2025-12-01 10:20:51.176 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  1 05:20:51 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:20:51 np0005540826 nova_compute[229148]: 2025-12-01 10:20:51.770 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Acquiring lock "refresh_cache-c46b48c6-f17b-46fd-909f-65cf07dab4e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  1 05:20:51 np0005540826 nova_compute[229148]: 2025-12-01 10:20:51.771 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Acquired lock "refresh_cache-c46b48c6-f17b-46fd-909f-65cf07dab4e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  1 05:20:51 np0005540826 nova_compute[229148]: 2025-12-01 10:20:51.771 229152 DEBUG nova.network.neutron [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Dec  1 05:20:51 np0005540826 nova_compute[229148]: 2025-12-01 10:20:51.771 229152 DEBUG nova.objects.instance [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c46b48c6-f17b-46fd-909f-65cf07dab4e6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  1 05:20:52 np0005540826 podman[239154]: 2025-12-01 10:20:52.022531898 +0000 UTC m=+0.102277906 container health_status 6b06bbb7dde72e1297fe084ecc5611549b12415c8a2a7d771b9728c6924dbf55 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec  1 05:20:52 np0005540826 nova_compute[229148]: 2025-12-01 10:20:52.327 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:20:52 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:20:52 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:20:52 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:20:52.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:20:53 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:20:53 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:20:53 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:20:53.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:20:54 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:20:54 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:20:54 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:20:54.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:20:54 np0005540826 nova_compute[229148]: 2025-12-01 10:20:54.590 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:20:55 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:20:55 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:20:55 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:20:55.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:20:55 np0005540826 nova_compute[229148]: 2025-12-01 10:20:55.866 229152 DEBUG nova.network.neutron [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Updating instance_info_cache with network_info: [{"id": "c9f60650-13a9-4fee-a03c-4878c9ea3a07", "address": "fa:16:3e:0c:ce:19", "network": {"id": "4e3600e5-3d5c-4f84-844a-9271184774dd", "bridge": "br-int", "label": "tempest-network-smoke--432110090", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.243", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9f60650-13", "ovs_interfaceid": "c9f60650-13a9-4fee-a03c-4878c9ea3a07", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "61c72b1f-aea9-4ec0-80c5-03c2e87aad8f", "address": "fa:16:3e:71:2f:0e", "network": {"id": "0e5b3de9-56f5-4f4d-87c1-c01596567748", "bridge": "br-int", "label": "tempest-network-smoke--1903173849", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": 
true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61c72b1f-ae", "ovs_interfaceid": "61c72b1f-aea9-4ec0-80c5-03c2e87aad8f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  1 05:20:56 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:20:56 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:20:56 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:20:56.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:20:56 np0005540826 nova_compute[229148]: 2025-12-01 10:20:56.574 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Releasing lock "refresh_cache-c46b48c6-f17b-46fd-909f-65cf07dab4e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  1 05:20:56 np0005540826 nova_compute[229148]: 2025-12-01 10:20:56.574 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Dec  1 05:20:56 np0005540826 nova_compute[229148]: 2025-12-01 10:20:56.575 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:20:56 np0005540826 nova_compute[229148]: 2025-12-01 10:20:56.575 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:20:56 np0005540826 nova_compute[229148]: 2025-12-01 10:20:56.575 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:20:56 np0005540826 nova_compute[229148]: 2025-12-01 10:20:56.576 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:20:56 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:20:56 np0005540826 podman[239185]: 2025-12-01 10:20:56.977894045 +0000 UTC m=+0.057981715 container health_status 2f6a34a2eb1139886d5df897b4264e2aa17883bd121a8757ac91426f6d44356f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, 
org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Dec  1 05:20:57 np0005540826 nova_compute[229148]: 2025-12-01 10:20:57.109 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:20:57 np0005540826 nova_compute[229148]: 2025-12-01 10:20:57.109 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:20:57 np0005540826 nova_compute[229148]: 2025-12-01 10:20:57.110 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  1 05:20:57 np0005540826 nova_compute[229148]: 2025-12-01 10:20:57.110 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:20:57 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:20:57 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:20:57 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:20:57.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:20:57 np0005540826 nova_compute[229148]: 2025-12-01 10:20:57.135 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:20:57 np0005540826 nova_compute[229148]: 2025-12-01 10:20:57.137 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:20:57 np0005540826 nova_compute[229148]: 2025-12-01 10:20:57.137 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:20:57 np0005540826 nova_compute[229148]: 2025-12-01 10:20:57.137 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  1 05:20:57 np0005540826 nova_compute[229148]: 2025-12-01 10:20:57.138 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:20:57 np0005540826 nova_compute[229148]: 2025-12-01 10:20:57.330 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:20:57 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:20:57 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1242197287' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:20:57 np0005540826 nova_compute[229148]: 2025-12-01 10:20:57.596 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.459s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:20:57 np0005540826 nova_compute[229148]: 2025-12-01 10:20:57.713 229152 DEBUG nova.virt.libvirt.driver [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] skipping disk for instance-00000006 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Dec  1 05:20:57 np0005540826 nova_compute[229148]: 2025-12-01 10:20:57.714 229152 DEBUG nova.virt.libvirt.driver [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] skipping disk for instance-00000006 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Dec  1 05:20:57 np0005540826 nova_compute[229148]: 2025-12-01 10:20:57.910 229152 WARNING nova.virt.libvirt.driver [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  1 05:20:57 np0005540826 nova_compute[229148]: 2025-12-01 10:20:57.911 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4758MB free_disk=59.9217529296875GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  1 05:20:57 np0005540826 nova_compute[229148]: 2025-12-01 10:20:57.912 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:20:57 np0005540826 nova_compute[229148]: 2025-12-01 10:20:57.912 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:20:58 np0005540826 nova_compute[229148]: 2025-12-01 10:20:58.127 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Instance c46b48c6-f17b-46fd-909f-65cf07dab4e6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  1 05:20:58 np0005540826 nova_compute[229148]: 2025-12-01 10:20:58.128 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  1 05:20:58 np0005540826 nova_compute[229148]: 2025-12-01 10:20:58.128 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=59GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  1 05:20:58 np0005540826 nova_compute[229148]: 2025-12-01 10:20:58.173 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:20:58 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:20:58 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:20:58 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:20:58.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:20:58 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:20:58 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3258920012' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:20:58 np0005540826 nova_compute[229148]: 2025-12-01 10:20:58.637 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:20:58 np0005540826 nova_compute[229148]: 2025-12-01 10:20:58.644 229152 DEBUG nova.compute.provider_tree [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Inventory has not changed in ProviderTree for provider: 19014d04-db84-4f3d-831b-084720e9168c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  1 05:20:59 np0005540826 nova_compute[229148]: 2025-12-01 10:20:59.008 229152 DEBUG nova.scheduler.client.report [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Inventory has not changed for provider 19014d04-db84-4f3d-831b-084720e9168c based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  1 05:20:59 np0005540826 nova_compute[229148]: 2025-12-01 10:20:59.028 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  1 05:20:59 np0005540826 nova_compute[229148]: 2025-12-01 10:20:59.029 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.117s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:20:59 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:20:59 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:20:59 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:20:59.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:20:59 np0005540826 nova_compute[229148]: 2025-12-01 10:20:59.593 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:21:00 np0005540826 nova_compute[229148]: 2025-12-01 10:21:00.025 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:21:00 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:21:00 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:21:00 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:21:00.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:21:01 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:21:01 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:21:01 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:21:01.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:21:01 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:21:02 np0005540826 nova_compute[229148]: 2025-12-01 10:21:02.334 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:21:02 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:21:02 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:21:02 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:21:02.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:21:03 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:21:03 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:21:03 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:21:03.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:21:04 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:21:04 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:21:04 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:21:04.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:21:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:21:04.553 141685 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:21:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:21:04.553 141685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:21:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:21:04.554 141685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:21:04 np0005540826 nova_compute[229148]: 2025-12-01 10:21:04.596 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:21:05 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:21:05 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:21:05 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:21:05.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:21:06 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:21:06 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:21:06 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:21:06.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:21:06 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:21:07 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec  1 05:21:07 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1514814098' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec  1 05:21:07 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec  1 05:21:07 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1514814098' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec  1 05:21:07 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:21:07 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:21:07 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:21:07.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:21:07 np0005540826 nova_compute[229148]: 2025-12-01 10:21:07.336 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:21:08 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:21:08 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:21:08 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:21:08.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:21:08 np0005540826 nova_compute[229148]: 2025-12-01 10:21:08.596 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:21:08 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:21:08.597 141685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '36:10:da', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '4e:5c:35:98:90:37'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  1 05:21:08 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:21:08.598 141685 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  1 05:21:08 np0005540826 nova_compute[229148]: 2025-12-01 10:21:08.897 229152 DEBUG nova.compute.manager [req-9498248e-849d-497d-8e99-1a48f17d7ab5 req-5f1de2e3-c4b8-48ac-8206-ab5640a05593 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Received event network-changed-61c72b1f-aea9-4ec0-80c5-03c2e87aad8f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  1 05:21:08 np0005540826 nova_compute[229148]: 2025-12-01 10:21:08.898 229152 DEBUG nova.compute.manager [req-9498248e-849d-497d-8e99-1a48f17d7ab5 req-5f1de2e3-c4b8-48ac-8206-ab5640a05593 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Refreshing instance network info cache due to event network-changed-61c72b1f-aea9-4ec0-80c5-03c2e87aad8f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  1 05:21:08 np0005540826 nova_compute[229148]: 2025-12-01 10:21:08.898 229152 DEBUG oslo_concurrency.lockutils [req-9498248e-849d-497d-8e99-1a48f17d7ab5 req-5f1de2e3-c4b8-48ac-8206-ab5640a05593 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Acquiring lock "refresh_cache-c46b48c6-f17b-46fd-909f-65cf07dab4e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  1 05:21:08 np0005540826 nova_compute[229148]: 2025-12-01 10:21:08.898 229152 DEBUG oslo_concurrency.lockutils [req-9498248e-849d-497d-8e99-1a48f17d7ab5 req-5f1de2e3-c4b8-48ac-8206-ab5640a05593 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Acquired lock "refresh_cache-c46b48c6-f17b-46fd-909f-65cf07dab4e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  1 05:21:08 np0005540826 nova_compute[229148]: 2025-12-01 10:21:08.898 229152 DEBUG nova.network.neutron [req-9498248e-849d-497d-8e99-1a48f17d7ab5 req-5f1de2e3-c4b8-48ac-8206-ab5640a05593 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Refreshing network info cache for port 61c72b1f-aea9-4ec0-80c5-03c2e87aad8f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  1 05:21:09 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:21:09 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:21:09 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:21:09.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:21:09 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:21:09.601 141685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=b99910e3-15ec-4cc7-b887-f5229f22d165, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  1 05:21:09 np0005540826 nova_compute[229148]: 2025-12-01 10:21:09.642 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:21:10 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:21:10 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:21:10 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:21:10.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:21:10 np0005540826 nova_compute[229148]: 2025-12-01 10:21:10.620 229152 DEBUG nova.network.neutron [req-9498248e-849d-497d-8e99-1a48f17d7ab5 req-5f1de2e3-c4b8-48ac-8206-ab5640a05593 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Updated VIF entry in instance network info cache for port 61c72b1f-aea9-4ec0-80c5-03c2e87aad8f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  1 05:21:10 np0005540826 nova_compute[229148]: 2025-12-01 10:21:10.621 229152 DEBUG nova.network.neutron [req-9498248e-849d-497d-8e99-1a48f17d7ab5 req-5f1de2e3-c4b8-48ac-8206-ab5640a05593 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Updating instance_info_cache with network_info: [{"id": "c9f60650-13a9-4fee-a03c-4878c9ea3a07", "address": "fa:16:3e:0c:ce:19", "network": {"id": "4e3600e5-3d5c-4f84-844a-9271184774dd", "bridge": "br-int", "label": "tempest-network-smoke--432110090", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.243", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9f60650-13", "ovs_interfaceid": "c9f60650-13a9-4fee-a03c-4878c9ea3a07", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "61c72b1f-aea9-4ec0-80c5-03c2e87aad8f", "address": "fa:16:3e:71:2f:0e", "network": {"id": "0e5b3de9-56f5-4f4d-87c1-c01596567748", "bridge": "br-int", "label": "tempest-network-smoke--1903173849", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61c72b1f-ae", "ovs_interfaceid": "61c72b1f-aea9-4ec0-80c5-03c2e87aad8f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  1 05:21:10 np0005540826 nova_compute[229148]: 2025-12-01 10:21:10.638 229152 DEBUG oslo_concurrency.lockutils [req-9498248e-849d-497d-8e99-1a48f17d7ab5 req-5f1de2e3-c4b8-48ac-8206-ab5640a05593 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Releasing lock "refresh_cache-c46b48c6-f17b-46fd-909f-65cf07dab4e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  1 05:21:11 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:21:11 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:21:11 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:21:11.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:21:11 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:21:12 np0005540826 nova_compute[229148]: 2025-12-01 10:21:12.339 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:21:12 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:21:12 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:21:12 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:21:12.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:21:13 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:21:13 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:21:13 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:21:13.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:21:13 np0005540826 podman[239284]: 2025-12-01 10:21:13.988223627 +0000 UTC m=+0.071297919 container health_status 9ce59c2758d021c154e97f8a3178b73d8f43045c59b9ba6fd93369a760a49136 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=multipathd, container_name=multipathd)
Dec  1 05:21:14 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:21:14 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:21:14 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:21:14.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:21:14 np0005540826 nova_compute[229148]: 2025-12-01 10:21:14.680 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:21:15 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:21:15 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:21:15 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:21:15.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:21:15 np0005540826 ceph-mon[80026]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #55. Immutable memtables: 0.
Dec  1 05:21:15 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:21:15.468158) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  1 05:21:15 np0005540826 ceph-mon[80026]: rocksdb: [db/flush_job.cc:856] [default] [JOB 31] Flushing memtable with next log file: 55
Dec  1 05:21:15 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584475468252, "job": 31, "event": "flush_started", "num_memtables": 1, "num_entries": 1215, "num_deletes": 256, "total_data_size": 2765552, "memory_usage": 2807752, "flush_reason": "Manual Compaction"}
Dec  1 05:21:15 np0005540826 ceph-mon[80026]: rocksdb: [db/flush_job.cc:885] [default] [JOB 31] Level-0 flush table #56: started
Dec  1 05:21:15 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584475485480, "cf_name": "default", "job": 31, "event": "table_file_creation", "file_number": 56, "file_size": 1819441, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 28315, "largest_seqno": 29525, "table_properties": {"data_size": 1814239, "index_size": 2598, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1541, "raw_key_size": 11246, "raw_average_key_size": 19, "raw_value_size": 1803652, "raw_average_value_size": 3057, "num_data_blocks": 116, "num_entries": 590, "num_filter_entries": 590, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764584378, "oldest_key_time": 1764584378, "file_creation_time": 1764584475, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d6e1f97e-eb58-41c1-b758-cb672eabd75e", "db_session_id": "KSHNHU57VR1LHZ6EZUM4", "orig_file_number": 56, "seqno_to_time_mapping": "N/A"}}
Dec  1 05:21:15 np0005540826 ceph-mon[80026]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 31] Flush lasted 17384 microseconds, and 5698 cpu microseconds.
Dec  1 05:21:15 np0005540826 ceph-mon[80026]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 05:21:15 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:21:15.485552) [db/flush_job.cc:967] [default] [JOB 31] Level-0 flush table #56: 1819441 bytes OK
Dec  1 05:21:15 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:21:15.485578) [db/memtable_list.cc:519] [default] Level-0 commit table #56 started
Dec  1 05:21:15 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:21:15.487779) [db/memtable_list.cc:722] [default] Level-0 commit table #56: memtable #1 done
Dec  1 05:21:15 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:21:15.487840) EVENT_LOG_v1 {"time_micros": 1764584475487828, "job": 31, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  1 05:21:15 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:21:15.487869) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  1 05:21:15 np0005540826 ceph-mon[80026]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 31] Try to delete WAL files size 2759671, prev total WAL file size 2759671, number of live WAL files 2.
Dec  1 05:21:15 np0005540826 ceph-mon[80026]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000052.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:21:15 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:21:15.489031) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D00353033' seq:72057594037927935, type:22 .. '6C6F676D00373535' seq:0, type:0; will stop at (end)
Dec  1 05:21:15 np0005540826 ceph-mon[80026]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 32] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  1 05:21:15 np0005540826 ceph-mon[80026]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 31 Base level 0, inputs: [56(1776KB)], [54(14MB)]
Dec  1 05:21:15 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584475489147, "job": 32, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [56], "files_L6": [54], "score": -1, "input_data_size": 16965018, "oldest_snapshot_seqno": -1}
Dec  1 05:21:15 np0005540826 ceph-mon[80026]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 32] Generated table #57: 5945 keys, 16845976 bytes, temperature: kUnknown
Dec  1 05:21:15 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584475578250, "cf_name": "default", "job": 32, "event": "table_file_creation", "file_number": 57, "file_size": 16845976, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 16803171, "index_size": 26823, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 14917, "raw_key_size": 151865, "raw_average_key_size": 25, "raw_value_size": 16692675, "raw_average_value_size": 2807, "num_data_blocks": 1098, "num_entries": 5945, "num_filter_entries": 5945, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582559, "oldest_key_time": 0, "file_creation_time": 1764584475, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d6e1f97e-eb58-41c1-b758-cb672eabd75e", "db_session_id": "KSHNHU57VR1LHZ6EZUM4", "orig_file_number": 57, "seqno_to_time_mapping": "N/A"}}
Dec  1 05:21:15 np0005540826 ceph-mon[80026]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 05:21:15 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:21:15.578511) [db/compaction/compaction_job.cc:1663] [default] [JOB 32] Compacted 1@0 + 1@6 files to L6 => 16845976 bytes
Dec  1 05:21:15 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:21:15.579730) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 190.2 rd, 188.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.7, 14.4 +0.0 blob) out(16.1 +0.0 blob), read-write-amplify(18.6) write-amplify(9.3) OK, records in: 6471, records dropped: 526 output_compression: NoCompression
Dec  1 05:21:15 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:21:15.579753) EVENT_LOG_v1 {"time_micros": 1764584475579742, "job": 32, "event": "compaction_finished", "compaction_time_micros": 89179, "compaction_time_cpu_micros": 36099, "output_level": 6, "num_output_files": 1, "total_output_size": 16845976, "num_input_records": 6471, "num_output_records": 5945, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  1 05:21:15 np0005540826 ceph-mon[80026]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000056.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:21:15 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584475580403, "job": 32, "event": "table_file_deletion", "file_number": 56}
Dec  1 05:21:15 np0005540826 ceph-mon[80026]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000054.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:21:15 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584475583660, "job": 32, "event": "table_file_deletion", "file_number": 54}
Dec  1 05:21:15 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:21:15.488872) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:21:15 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:21:15.583749) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:21:15 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:21:15.583757) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:21:15 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:21:15.583759) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:21:15 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:21:15.583761) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:21:15 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:21:15.583763) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:21:16 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:21:16 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:21:16 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:21:16.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:21:16 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:21:17 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:21:17 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:21:17 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:21:17.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:21:17 np0005540826 nova_compute[229148]: 2025-12-01 10:21:17.342 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:21:18 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:21:18 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:21:18 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:21:18.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:21:19 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:21:19 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:21:19 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:21:19.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:21:19 np0005540826 nova_compute[229148]: 2025-12-01 10:21:19.683 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:21:20 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:21:20 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:21:20 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:21:20.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:21:21 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:21:21 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:21:21 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:21:21.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:21:21 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:21:22 np0005540826 nova_compute[229148]: 2025-12-01 10:21:22.344 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:21:22 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:21:22 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:21:22 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:21:22.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:21:23 np0005540826 podman[239309]: 2025-12-01 10:21:23.00687706 +0000 UTC m=+0.086449679 container health_status 6b06bbb7dde72e1297fe084ecc5611549b12415c8a2a7d771b9728c6924dbf55 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  1 05:21:23 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:21:23 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:21:23 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:21:23.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:21:24 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:21:24 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:21:24 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:21:24.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:21:24 np0005540826 nova_compute[229148]: 2025-12-01 10:21:24.685 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:21:25 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:21:25 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:21:25 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:21:25.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:21:26 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:21:26 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.002000051s ======
Dec  1 05:21:26 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:21:26.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000051s
Dec  1 05:21:26 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:21:27 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:21:27 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:21:27 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:21:27.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:21:27 np0005540826 nova_compute[229148]: 2025-12-01 10:21:27.346 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:21:27 np0005540826 podman[239363]: 2025-12-01 10:21:27.975253043 +0000 UTC m=+0.051581435 container health_status 2f6a34a2eb1139886d5df897b4264e2aa17883bd121a8757ac91426f6d44356f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  1 05:21:28 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:21:28 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:21:28 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:21:28.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:21:29 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:21:29 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:21:29 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:21:29.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:21:29 np0005540826 nova_compute[229148]: 2025-12-01 10:21:29.688 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:21:30 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:21:30 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:21:30 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:21:30.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:21:31 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:21:31 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:21:31 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:21:31.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:21:31 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:21:32 np0005540826 nova_compute[229148]: 2025-12-01 10:21:32.349 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:21:32 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:21:32 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:21:32 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:21:32.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:21:33 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:21:33 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:21:33 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:21:33.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:21:34 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:21:34 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:21:34 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:21:34.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:21:34 np0005540826 nova_compute[229148]: 2025-12-01 10:21:34.690 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:21:35 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:21:35 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:21:35 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:21:35.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:21:36 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:21:36 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:21:36 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:21:36.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:21:36 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:21:37 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:21:37 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:21:37 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:21:37.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:21:37 np0005540826 nova_compute[229148]: 2025-12-01 10:21:37.352 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:21:38 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:21:38 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:21:38 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:21:38.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:21:38 np0005540826 ovn_controller[132309]: 2025-12-01T10:21:38Z|00077|memory_trim|INFO|Detected inactivity (last active 30016 ms ago): trimming memory
Dec  1 05:21:39 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:21:39 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:21:39 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:21:39.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:21:39 np0005540826 nova_compute[229148]: 2025-12-01 10:21:39.693 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:21:40 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 05:21:40 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:21:40 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:21:40 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 05:21:40 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:21:40 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:21:40 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:21:40.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:21:41 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:21:41 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:21:41 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:21:41.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:21:41 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:21:42 np0005540826 nova_compute[229148]: 2025-12-01 10:21:42.354 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:21:42 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:21:42 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:21:42 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:21:42.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:21:43 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:21:43 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:21:43 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:21:43.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:21:44 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:21:44 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:21:44 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:21:44.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:21:44 np0005540826 nova_compute[229148]: 2025-12-01 10:21:44.695 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:21:44 np0005540826 podman[239473]: 2025-12-01 10:21:44.983283558 +0000 UTC m=+0.060598280 container health_status 9ce59c2758d021c154e97f8a3178b73d8f43045c59b9ba6fd93369a760a49136 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec  1 05:21:45 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:21:45 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:21:45 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:21:45.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:21:46 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:21:46 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 05:21:46 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:21:46.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 05:21:46 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:21:46 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:21:46 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:21:47 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:21:47 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:21:47 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:21:47.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:21:47 np0005540826 nova_compute[229148]: 2025-12-01 10:21:47.356 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:21:48 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:21:48.303 141685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '36:10:da', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '4e:5c:35:98:90:37'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  1 05:21:48 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:21:48.304 141685 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  1 05:21:48 np0005540826 nova_compute[229148]: 2025-12-01 10:21:48.304 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:21:48 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:21:48 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:21:48 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:21:48.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:21:49 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:21:49 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:21:49 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:21:49.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:21:49 np0005540826 nova_compute[229148]: 2025-12-01 10:21:49.697 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:21:50 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:21:50 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:21:50 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:21:50.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:21:51 np0005540826 nova_compute[229148]: 2025-12-01 10:21:51.105 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:21:51 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:21:51 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:21:51 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:21:51.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:21:51 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:21:52 np0005540826 nova_compute[229148]: 2025-12-01 10:21:52.358 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:21:52 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:21:52 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:21:52 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:21:52.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:21:53 np0005540826 nova_compute[229148]: 2025-12-01 10:21:53.110 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:21:53 np0005540826 nova_compute[229148]: 2025-12-01 10:21:53.110 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  1 05:21:53 np0005540826 nova_compute[229148]: 2025-12-01 10:21:53.111 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  1 05:21:53 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:21:53 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 05:21:53 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:21:53.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 05:21:53 np0005540826 nova_compute[229148]: 2025-12-01 10:21:53.802 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Acquiring lock "refresh_cache-c46b48c6-f17b-46fd-909f-65cf07dab4e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  1 05:21:53 np0005540826 nova_compute[229148]: 2025-12-01 10:21:53.803 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Acquired lock "refresh_cache-c46b48c6-f17b-46fd-909f-65cf07dab4e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  1 05:21:53 np0005540826 nova_compute[229148]: 2025-12-01 10:21:53.803 229152 DEBUG nova.network.neutron [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Dec  1 05:21:53 np0005540826 nova_compute[229148]: 2025-12-01 10:21:53.803 229152 DEBUG nova.objects.instance [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c46b48c6-f17b-46fd-909f-65cf07dab4e6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  1 05:21:53 np0005540826 podman[239549]: 2025-12-01 10:21:53.999621141 +0000 UTC m=+0.085042505 container health_status 6b06bbb7dde72e1297fe084ecc5611549b12415c8a2a7d771b9728c6924dbf55 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  1 05:21:54 np0005540826 nova_compute[229148]: 2025-12-01 10:21:54.264 229152 DEBUG oslo_concurrency.lockutils [None req-09bdf7db-8c31-40d7-bddf-4997f2924694 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Acquiring lock "interface-c46b48c6-f17b-46fd-909f-65cf07dab4e6-61c72b1f-aea9-4ec0-80c5-03c2e87aad8f" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:21:54 np0005540826 nova_compute[229148]: 2025-12-01 10:21:54.265 229152 DEBUG oslo_concurrency.lockutils [None req-09bdf7db-8c31-40d7-bddf-4997f2924694 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "interface-c46b48c6-f17b-46fd-909f-65cf07dab4e6-61c72b1f-aea9-4ec0-80c5-03c2e87aad8f" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:21:54 np0005540826 nova_compute[229148]: 2025-12-01 10:21:54.277 229152 DEBUG nova.objects.instance [None req-09bdf7db-8c31-40d7-bddf-4997f2924694 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lazy-loading 'flavor' on Instance uuid c46b48c6-f17b-46fd-909f-65cf07dab4e6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  1 05:21:54 np0005540826 nova_compute[229148]: 2025-12-01 10:21:54.298 229152 DEBUG nova.virt.libvirt.vif [None req-09bdf7db-8c31-40d7-bddf-4997f2924694 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-01T10:19:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1386308587',display_name='tempest-TestNetworkBasicOps-server-1386308587',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1386308587',id=6,image_ref='8f75d6de-6ce0-44e1-b417-d0111424475b',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNlsicanYXytlXTgYTmaMFboTco/f673p0Przv+d4K28ZijgatIkOSXNpjiz5VeubRvfxH8C0Wp0yqZF71ydwtKBTEW7G7vvokxNdm8j7TByXnmiUBYXpHHmHVf8HImEjA==',key_name='tempest-TestNetworkBasicOps-553113242',keypairs=<?>,launch_index=0,launched_at=2025-12-01T10:20:03Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='9f6be4e572624210b91193c011607c08',ramdisk_id='',reservation_id='r-djnfhl6c',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='8f75d6de-6ce0-44e1-b417-d0111424475b',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1248115384',owner_user_name='tempest-TestNetworkBasicOps-1248115384-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-12-01T10:20:04Z,user_data=None,user_id='5b56a238daf0445798410e51caada0ff',uuid=c46b48c6-f17b-46fd-909f-65cf07dab4e6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "61c72b1f-aea9-4ec0-80c5-03c2e87aad8f", "address": "fa:16:3e:71:2f:0e", "network": {"id": "0e5b3de9-56f5-4f4d-87c1-c01596567748", "bridge": "br-int", "label": "tempest-network-smoke--1903173849", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61c72b1f-ae", "ovs_interfaceid": "61c72b1f-aea9-4ec0-80c5-03c2e87aad8f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  1 05:21:54 np0005540826 nova_compute[229148]: 2025-12-01 10:21:54.299 229152 DEBUG nova.network.os_vif_util [None req-09bdf7db-8c31-40d7-bddf-4997f2924694 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Converting VIF {"id": "61c72b1f-aea9-4ec0-80c5-03c2e87aad8f", "address": "fa:16:3e:71:2f:0e", "network": {"id": "0e5b3de9-56f5-4f4d-87c1-c01596567748", "bridge": "br-int", "label": "tempest-network-smoke--1903173849", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61c72b1f-ae", "ovs_interfaceid": "61c72b1f-aea9-4ec0-80c5-03c2e87aad8f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  1 05:21:54 np0005540826 nova_compute[229148]: 2025-12-01 10:21:54.299 229152 DEBUG nova.network.os_vif_util [None req-09bdf7db-8c31-40d7-bddf-4997f2924694 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:71:2f:0e,bridge_name='br-int',has_traffic_filtering=True,id=61c72b1f-aea9-4ec0-80c5-03c2e87aad8f,network=Network(0e5b3de9-56f5-4f4d-87c1-c01596567748),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap61c72b1f-ae') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  1 05:21:54 np0005540826 nova_compute[229148]: 2025-12-01 10:21:54.304 229152 DEBUG nova.virt.libvirt.guest [None req-09bdf7db-8c31-40d7-bddf-4997f2924694 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:71:2f:0e"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap61c72b1f-ae"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Dec  1 05:21:54 np0005540826 nova_compute[229148]: 2025-12-01 10:21:54.306 229152 DEBUG nova.virt.libvirt.guest [None req-09bdf7db-8c31-40d7-bddf-4997f2924694 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:71:2f:0e"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap61c72b1f-ae"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Dec  1 05:21:54 np0005540826 nova_compute[229148]: 2025-12-01 10:21:54.308 229152 DEBUG nova.virt.libvirt.driver [None req-09bdf7db-8c31-40d7-bddf-4997f2924694 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Attempting to detach device tap61c72b1f-ae from instance c46b48c6-f17b-46fd-909f-65cf07dab4e6 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Dec  1 05:21:54 np0005540826 nova_compute[229148]: 2025-12-01 10:21:54.309 229152 DEBUG nova.virt.libvirt.guest [None req-09bdf7db-8c31-40d7-bddf-4997f2924694 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] detach device xml: <interface type="ethernet">
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  <mac address="fa:16:3e:71:2f:0e"/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  <model type="virtio"/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  <driver name="vhost" rx_queue_size="512"/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  <mtu size="1442"/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  <target dev="tap61c72b1f-ae"/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]: </interface>
Dec  1 05:21:54 np0005540826 nova_compute[229148]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Dec  1 05:21:54 np0005540826 nova_compute[229148]: 2025-12-01 10:21:54.315 229152 DEBUG nova.virt.libvirt.guest [None req-09bdf7db-8c31-40d7-bddf-4997f2924694 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:71:2f:0e"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap61c72b1f-ae"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Dec  1 05:21:54 np0005540826 nova_compute[229148]: 2025-12-01 10:21:54.320 229152 DEBUG nova.virt.libvirt.guest [None req-09bdf7db-8c31-40d7-bddf-4997f2924694 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:71:2f:0e"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap61c72b1f-ae"/></interface>not found in domain: <domain type='kvm' id='4'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  <name>instance-00000006</name>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  <uuid>c46b48c6-f17b-46fd-909f-65cf07dab4e6</uuid>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  <metadata>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  <nova:name>tempest-TestNetworkBasicOps-server-1386308587</nova:name>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  <nova:creationTime>2025-12-01 10:20:32</nova:creationTime>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  <nova:flavor name="m1.nano">
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <nova:memory>128</nova:memory>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <nova:disk>1</nova:disk>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <nova:swap>0</nova:swap>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <nova:ephemeral>0</nova:ephemeral>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <nova:vcpus>1</nova:vcpus>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  </nova:flavor>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  <nova:owner>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <nova:user uuid="5b56a238daf0445798410e51caada0ff">tempest-TestNetworkBasicOps-1248115384-project-member</nova:user>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <nova:project uuid="9f6be4e572624210b91193c011607c08">tempest-TestNetworkBasicOps-1248115384</nova:project>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  </nova:owner>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  <nova:root type="image" uuid="8f75d6de-6ce0-44e1-b417-d0111424475b"/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  <nova:ports>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <nova:port uuid="c9f60650-13a9-4fee-a03c-4878c9ea3a07">
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    </nova:port>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <nova:port uuid="61c72b1f-aea9-4ec0-80c5-03c2e87aad8f">
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <nova:ip type="fixed" address="10.100.0.28" ipVersion="4"/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    </nova:port>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  </nova:ports>
Dec  1 05:21:54 np0005540826 nova_compute[229148]: </nova:instance>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  </metadata>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  <memory unit='KiB'>131072</memory>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  <currentMemory unit='KiB'>131072</currentMemory>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  <vcpu placement='static'>1</vcpu>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  <resource>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <partition>/machine</partition>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  </resource>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  <sysinfo type='smbios'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <system>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <entry name='manufacturer'>RDO</entry>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <entry name='product'>OpenStack Compute</entry>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <entry name='serial'>c46b48c6-f17b-46fd-909f-65cf07dab4e6</entry>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <entry name='uuid'>c46b48c6-f17b-46fd-909f-65cf07dab4e6</entry>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <entry name='family'>Virtual Machine</entry>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    </system>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  </sysinfo>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  <os>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <boot dev='hd'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <smbios mode='sysinfo'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  </os>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  <features>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <acpi/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <apic/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <vmcoreinfo state='on'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  </features>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  <cpu mode='custom' match='exact' check='full'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <model fallback='forbid'>EPYC-Rome</model>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <vendor>AMD</vendor>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <feature policy='require' name='x2apic'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <feature policy='require' name='tsc-deadline'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <feature policy='require' name='hypervisor'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <feature policy='require' name='tsc_adjust'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <feature policy='require' name='spec-ctrl'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <feature policy='require' name='stibp'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <feature policy='require' name='ssbd'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <feature policy='require' name='cmp_legacy'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <feature policy='require' name='overflow-recov'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <feature policy='require' name='succor'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <feature policy='require' name='ibrs'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <feature policy='require' name='amd-ssbd'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <feature policy='require' name='virt-ssbd'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <feature policy='disable' name='lbrv'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <feature policy='disable' name='tsc-scale'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <feature policy='disable' name='vmcb-clean'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <feature policy='disable' name='flushbyasid'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <feature policy='disable' name='pause-filter'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <feature policy='disable' name='pfthreshold'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <feature policy='disable' name='svme-addr-chk'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <feature policy='require' name='lfence-always-serializing'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <feature policy='disable' name='xsaves'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <feature policy='disable' name='svm'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <feature policy='require' name='topoext'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <feature policy='disable' name='npt'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <feature policy='disable' name='nrip-save'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  </cpu>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  <clock offset='utc'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <timer name='pit' tickpolicy='delay'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <timer name='rtc' tickpolicy='catchup'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <timer name='hpet' present='no'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  </clock>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  <on_poweroff>destroy</on_poweroff>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  <on_reboot>restart</on_reboot>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  <on_crash>destroy</on_crash>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  <devices>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <disk type='network' device='disk'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <driver name='qemu' type='raw' cache='none'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <auth username='openstack'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:        <secret type='ceph' uuid='365f19c2-81e5-5edd-b6b4-280555214d3a'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      </auth>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <source protocol='rbd' name='vms/c46b48c6-f17b-46fd-909f-65cf07dab4e6_disk' index='2'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:        <host name='192.168.122.100' port='6789'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:        <host name='192.168.122.102' port='6789'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:        <host name='192.168.122.101' port='6789'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      </source>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <target dev='vda' bus='virtio'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <alias name='virtio-disk0'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    </disk>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <disk type='network' device='cdrom'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <driver name='qemu' type='raw' cache='none'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <auth username='openstack'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:        <secret type='ceph' uuid='365f19c2-81e5-5edd-b6b4-280555214d3a'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      </auth>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <source protocol='rbd' name='vms/c46b48c6-f17b-46fd-909f-65cf07dab4e6_disk.config' index='1'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:        <host name='192.168.122.100' port='6789'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:        <host name='192.168.122.102' port='6789'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:        <host name='192.168.122.101' port='6789'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      </source>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <target dev='sda' bus='sata'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <readonly/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <alias name='sata0-0-0'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    </disk>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <controller type='pci' index='0' model='pcie-root'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <alias name='pcie.0'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <controller type='pci' index='1' model='pcie-root-port'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <target chassis='1' port='0x10'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <alias name='pci.1'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <controller type='pci' index='2' model='pcie-root-port'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <target chassis='2' port='0x11'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <alias name='pci.2'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <controller type='pci' index='3' model='pcie-root-port'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <target chassis='3' port='0x12'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <alias name='pci.3'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <controller type='pci' index='4' model='pcie-root-port'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <target chassis='4' port='0x13'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <alias name='pci.4'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <controller type='pci' index='5' model='pcie-root-port'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <target chassis='5' port='0x14'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <alias name='pci.5'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <controller type='pci' index='6' model='pcie-root-port'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <target chassis='6' port='0x15'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <alias name='pci.6'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <controller type='pci' index='7' model='pcie-root-port'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <target chassis='7' port='0x16'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <alias name='pci.7'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <controller type='pci' index='8' model='pcie-root-port'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <target chassis='8' port='0x17'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <alias name='pci.8'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <controller type='pci' index='9' model='pcie-root-port'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <target chassis='9' port='0x18'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <alias name='pci.9'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <controller type='pci' index='10' model='pcie-root-port'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <target chassis='10' port='0x19'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <alias name='pci.10'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <controller type='pci' index='11' model='pcie-root-port'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <target chassis='11' port='0x1a'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <alias name='pci.11'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <controller type='pci' index='12' model='pcie-root-port'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <target chassis='12' port='0x1b'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <alias name='pci.12'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <controller type='pci' index='13' model='pcie-root-port'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <target chassis='13' port='0x1c'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <alias name='pci.13'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <controller type='pci' index='14' model='pcie-root-port'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <target chassis='14' port='0x1d'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <alias name='pci.14'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <controller type='pci' index='15' model='pcie-root-port'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <target chassis='15' port='0x1e'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <alias name='pci.15'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <controller type='pci' index='16' model='pcie-root-port'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <target chassis='16' port='0x1f'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <alias name='pci.16'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <controller type='pci' index='17' model='pcie-root-port'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <target chassis='17' port='0x20'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <alias name='pci.17'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <controller type='pci' index='18' model='pcie-root-port'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <target chassis='18' port='0x21'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <alias name='pci.18'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <controller type='pci' index='19' model='pcie-root-port'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <target chassis='19' port='0x22'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <alias name='pci.19'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <controller type='pci' index='20' model='pcie-root-port'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <target chassis='20' port='0x23'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <alias name='pci.20'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <controller type='pci' index='21' model='pcie-root-port'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <target chassis='21' port='0x24'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <alias name='pci.21'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <controller type='pci' index='22' model='pcie-root-port'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <target chassis='22' port='0x25'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <alias name='pci.22'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <controller type='pci' index='23' model='pcie-root-port'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <target chassis='23' port='0x26'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <alias name='pci.23'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <controller type='pci' index='24' model='pcie-root-port'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <target chassis='24' port='0x27'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <alias name='pci.24'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <controller type='pci' index='25' model='pcie-root-port'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <target chassis='25' port='0x28'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <alias name='pci.25'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <model name='pcie-pci-bridge'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <alias name='pci.26'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <controller type='usb' index='0' model='piix3-uhci'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <alias name='usb'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <controller type='sata' index='0'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <alias name='ide'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <interface type='ethernet'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <mac address='fa:16:3e:0c:ce:19'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <target dev='tapc9f60650-13'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <model type='virtio'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <driver name='vhost' rx_queue_size='512'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <mtu size='1442'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <alias name='net0'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    </interface>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <interface type='ethernet'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <mac address='fa:16:3e:71:2f:0e'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <target dev='tap61c72b1f-ae'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <model type='virtio'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <driver name='vhost' rx_queue_size='512'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <mtu size='1442'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <alias name='net1'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    </interface>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <serial type='pty'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <source path='/dev/pts/0'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <log file='/var/lib/nova/instances/c46b48c6-f17b-46fd-909f-65cf07dab4e6/console.log' append='off'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <target type='isa-serial' port='0'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:        <model name='isa-serial'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      </target>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <alias name='serial0'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    </serial>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <console type='pty' tty='/dev/pts/0'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <source path='/dev/pts/0'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <log file='/var/lib/nova/instances/c46b48c6-f17b-46fd-909f-65cf07dab4e6/console.log' append='off'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <target type='serial' port='0'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <alias name='serial0'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    </console>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <input type='tablet' bus='usb'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <alias name='input0'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <address type='usb' bus='0' port='1'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    </input>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <input type='mouse' bus='ps2'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <alias name='input1'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    </input>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <input type='keyboard' bus='ps2'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <alias name='input2'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    </input>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <listen type='address' address='::0'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    </graphics>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <audio id='1' type='none'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <video>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <model type='virtio' heads='1' primary='yes'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <alias name='video0'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    </video>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <watchdog model='itco' action='reset'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <alias name='watchdog0'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    </watchdog>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <memballoon model='virtio'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <stats period='10'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <alias name='balloon0'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    </memballoon>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <rng model='virtio'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <backend model='random'>/dev/urandom</backend>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <alias name='rng0'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    </rng>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  </devices>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <label>system_u:system_r:svirt_t:s0:c307,c925</label>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c307,c925</imagelabel>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  </seclabel>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <label>+107:+107</label>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <imagelabel>+107:+107</imagelabel>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  </seclabel>
Dec  1 05:21:54 np0005540826 nova_compute[229148]: </domain>
Dec  1 05:21:54 np0005540826 nova_compute[229148]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Dec  1 05:21:54 np0005540826 nova_compute[229148]: 2025-12-01 10:21:54.321 229152 INFO nova.virt.libvirt.driver [None req-09bdf7db-8c31-40d7-bddf-4997f2924694 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Successfully detached device tap61c72b1f-ae from instance c46b48c6-f17b-46fd-909f-65cf07dab4e6 from the persistent domain config.
Dec  1 05:21:54 np0005540826 nova_compute[229148]: 2025-12-01 10:21:54.321 229152 DEBUG nova.virt.libvirt.driver [None req-09bdf7db-8c31-40d7-bddf-4997f2924694 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] (1/8): Attempting to detach device tap61c72b1f-ae with device alias net1 from instance c46b48c6-f17b-46fd-909f-65cf07dab4e6 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523
Dec  1 05:21:54 np0005540826 nova_compute[229148]: 2025-12-01 10:21:54.321 229152 DEBUG nova.virt.libvirt.guest [None req-09bdf7db-8c31-40d7-bddf-4997f2924694 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] detach device xml: <interface type="ethernet">
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  <mac address="fa:16:3e:71:2f:0e"/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  <model type="virtio"/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  <driver name="vhost" rx_queue_size="512"/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  <mtu size="1442"/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  <target dev="tap61c72b1f-ae"/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]: </interface>
Dec  1 05:21:54 np0005540826 nova_compute[229148]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Dec  1 05:21:54 np0005540826 kernel: tap61c72b1f-ae (unregistering): left promiscuous mode
Dec  1 05:21:54 np0005540826 NetworkManager[48989]: <info>  [1764584514.3664] device (tap61c72b1f-ae): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  1 05:21:54 np0005540826 nova_compute[229148]: 2025-12-01 10:21:54.378 229152 DEBUG nova.virt.libvirt.driver [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] Received event <DeviceRemovedEvent: 1764584514.378296, c46b48c6-f17b-46fd-909f-65cf07dab4e6 => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370
Dec  1 05:21:54 np0005540826 nova_compute[229148]: 2025-12-01 10:21:54.380 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  1 05:21:54 np0005540826 ovn_controller[132309]: 2025-12-01T10:21:54Z|00078|binding|INFO|Releasing lport 61c72b1f-aea9-4ec0-80c5-03c2e87aad8f from this chassis (sb_readonly=0)
Dec  1 05:21:54 np0005540826 ovn_controller[132309]: 2025-12-01T10:21:54Z|00079|binding|INFO|Setting lport 61c72b1f-aea9-4ec0-80c5-03c2e87aad8f down in Southbound
Dec  1 05:21:54 np0005540826 ovn_controller[132309]: 2025-12-01T10:21:54Z|00080|binding|INFO|Removing iface tap61c72b1f-ae ovn-installed in OVS
Dec  1 05:21:54 np0005540826 nova_compute[229148]: 2025-12-01 10:21:54.382 229152 DEBUG nova.virt.libvirt.driver [None req-09bdf7db-8c31-40d7-bddf-4997f2924694 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Start waiting for the detach event from libvirt for device tap61c72b1f-ae with device alias net1 for instance c46b48c6-f17b-46fd-909f-65cf07dab4e6 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599
Dec  1 05:21:54 np0005540826 nova_compute[229148]: 2025-12-01 10:21:54.382 229152 DEBUG nova.virt.libvirt.guest [None req-09bdf7db-8c31-40d7-bddf-4997f2924694 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:71:2f:0e"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap61c72b1f-ae"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Dec  1 05:21:54 np0005540826 nova_compute[229148]: 2025-12-01 10:21:54.383 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  1 05:21:54 np0005540826 nova_compute[229148]: 2025-12-01 10:21:54.387 229152 DEBUG nova.virt.libvirt.guest [None req-09bdf7db-8c31-40d7-bddf-4997f2924694 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:71:2f:0e"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap61c72b1f-ae"/></interface> not found in domain: <domain type='kvm' id='4'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  <name>instance-00000006</name>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  <uuid>c46b48c6-f17b-46fd-909f-65cf07dab4e6</uuid>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  <metadata>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  <nova:name>tempest-TestNetworkBasicOps-server-1386308587</nova:name>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  <nova:creationTime>2025-12-01 10:20:32</nova:creationTime>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  <nova:flavor name="m1.nano">
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <nova:memory>128</nova:memory>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <nova:disk>1</nova:disk>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <nova:swap>0</nova:swap>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <nova:ephemeral>0</nova:ephemeral>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <nova:vcpus>1</nova:vcpus>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  </nova:flavor>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  <nova:owner>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <nova:user uuid="5b56a238daf0445798410e51caada0ff">tempest-TestNetworkBasicOps-1248115384-project-member</nova:user>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <nova:project uuid="9f6be4e572624210b91193c011607c08">tempest-TestNetworkBasicOps-1248115384</nova:project>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  </nova:owner>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  <nova:root type="image" uuid="8f75d6de-6ce0-44e1-b417-d0111424475b"/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  <nova:ports>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <nova:port uuid="c9f60650-13a9-4fee-a03c-4878c9ea3a07">
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    </nova:port>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <nova:port uuid="61c72b1f-aea9-4ec0-80c5-03c2e87aad8f">
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <nova:ip type="fixed" address="10.100.0.28" ipVersion="4"/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    </nova:port>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  </nova:ports>
Dec  1 05:21:54 np0005540826 nova_compute[229148]: </nova:instance>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  </metadata>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  <memory unit='KiB'>131072</memory>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  <currentMemory unit='KiB'>131072</currentMemory>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  <vcpu placement='static'>1</vcpu>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  <resource>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <partition>/machine</partition>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  </resource>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  <sysinfo type='smbios'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <system>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <entry name='manufacturer'>RDO</entry>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <entry name='product'>OpenStack Compute</entry>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <entry name='serial'>c46b48c6-f17b-46fd-909f-65cf07dab4e6</entry>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <entry name='uuid'>c46b48c6-f17b-46fd-909f-65cf07dab4e6</entry>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <entry name='family'>Virtual Machine</entry>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    </system>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  </sysinfo>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  <os>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <boot dev='hd'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <smbios mode='sysinfo'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  </os>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  <features>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <acpi/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <apic/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <vmcoreinfo state='on'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  </features>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  <cpu mode='custom' match='exact' check='full'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <model fallback='forbid'>EPYC-Rome</model>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <vendor>AMD</vendor>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <feature policy='require' name='x2apic'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <feature policy='require' name='tsc-deadline'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <feature policy='require' name='hypervisor'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <feature policy='require' name='tsc_adjust'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <feature policy='require' name='spec-ctrl'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <feature policy='require' name='stibp'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <feature policy='require' name='ssbd'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <feature policy='require' name='cmp_legacy'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <feature policy='require' name='overflow-recov'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <feature policy='require' name='succor'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <feature policy='require' name='ibrs'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <feature policy='require' name='amd-ssbd'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <feature policy='require' name='virt-ssbd'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <feature policy='disable' name='lbrv'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <feature policy='disable' name='tsc-scale'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <feature policy='disable' name='vmcb-clean'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <feature policy='disable' name='flushbyasid'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <feature policy='disable' name='pause-filter'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <feature policy='disable' name='pfthreshold'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <feature policy='disable' name='svme-addr-chk'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <feature policy='require' name='lfence-always-serializing'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <feature policy='disable' name='xsaves'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <feature policy='disable' name='svm'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <feature policy='require' name='topoext'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <feature policy='disable' name='npt'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <feature policy='disable' name='nrip-save'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  </cpu>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  <clock offset='utc'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <timer name='pit' tickpolicy='delay'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <timer name='rtc' tickpolicy='catchup'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <timer name='hpet' present='no'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  </clock>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  <on_poweroff>destroy</on_poweroff>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  <on_reboot>restart</on_reboot>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  <on_crash>destroy</on_crash>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  <devices>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <disk type='network' device='disk'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <driver name='qemu' type='raw' cache='none'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <auth username='openstack'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:        <secret type='ceph' uuid='365f19c2-81e5-5edd-b6b4-280555214d3a'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      </auth>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <source protocol='rbd' name='vms/c46b48c6-f17b-46fd-909f-65cf07dab4e6_disk' index='2'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:        <host name='192.168.122.100' port='6789'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:        <host name='192.168.122.102' port='6789'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:        <host name='192.168.122.101' port='6789'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      </source>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <target dev='vda' bus='virtio'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <alias name='virtio-disk0'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    </disk>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <disk type='network' device='cdrom'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <driver name='qemu' type='raw' cache='none'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <auth username='openstack'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:        <secret type='ceph' uuid='365f19c2-81e5-5edd-b6b4-280555214d3a'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      </auth>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <source protocol='rbd' name='vms/c46b48c6-f17b-46fd-909f-65cf07dab4e6_disk.config' index='1'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:        <host name='192.168.122.100' port='6789'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:        <host name='192.168.122.102' port='6789'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:        <host name='192.168.122.101' port='6789'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      </source>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <target dev='sda' bus='sata'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <readonly/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <alias name='sata0-0-0'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    </disk>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <controller type='pci' index='0' model='pcie-root'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <alias name='pcie.0'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <controller type='pci' index='1' model='pcie-root-port'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <target chassis='1' port='0x10'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <alias name='pci.1'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <controller type='pci' index='2' model='pcie-root-port'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <target chassis='2' port='0x11'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <alias name='pci.2'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <controller type='pci' index='3' model='pcie-root-port'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <target chassis='3' port='0x12'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <alias name='pci.3'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <controller type='pci' index='4' model='pcie-root-port'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <target chassis='4' port='0x13'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <alias name='pci.4'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <controller type='pci' index='5' model='pcie-root-port'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <target chassis='5' port='0x14'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <alias name='pci.5'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <controller type='pci' index='6' model='pcie-root-port'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <target chassis='6' port='0x15'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <alias name='pci.6'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <controller type='pci' index='7' model='pcie-root-port'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <target chassis='7' port='0x16'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <alias name='pci.7'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <controller type='pci' index='8' model='pcie-root-port'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <target chassis='8' port='0x17'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <alias name='pci.8'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <controller type='pci' index='9' model='pcie-root-port'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <target chassis='9' port='0x18'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <alias name='pci.9'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <controller type='pci' index='10' model='pcie-root-port'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <target chassis='10' port='0x19'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <alias name='pci.10'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <controller type='pci' index='11' model='pcie-root-port'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <target chassis='11' port='0x1a'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <alias name='pci.11'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <controller type='pci' index='12' model='pcie-root-port'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <target chassis='12' port='0x1b'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <alias name='pci.12'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <controller type='pci' index='13' model='pcie-root-port'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <target chassis='13' port='0x1c'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <alias name='pci.13'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <controller type='pci' index='14' model='pcie-root-port'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <target chassis='14' port='0x1d'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <alias name='pci.14'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <controller type='pci' index='15' model='pcie-root-port'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <target chassis='15' port='0x1e'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <alias name='pci.15'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <controller type='pci' index='16' model='pcie-root-port'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <target chassis='16' port='0x1f'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <alias name='pci.16'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <controller type='pci' index='17' model='pcie-root-port'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <target chassis='17' port='0x20'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <alias name='pci.17'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <controller type='pci' index='18' model='pcie-root-port'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <target chassis='18' port='0x21'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <alias name='pci.18'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <controller type='pci' index='19' model='pcie-root-port'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <target chassis='19' port='0x22'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <alias name='pci.19'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <controller type='pci' index='20' model='pcie-root-port'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <target chassis='20' port='0x23'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <alias name='pci.20'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <controller type='pci' index='21' model='pcie-root-port'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <target chassis='21' port='0x24'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <alias name='pci.21'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <controller type='pci' index='22' model='pcie-root-port'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <target chassis='22' port='0x25'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <alias name='pci.22'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <controller type='pci' index='23' model='pcie-root-port'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <target chassis='23' port='0x26'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <alias name='pci.23'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <controller type='pci' index='24' model='pcie-root-port'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <target chassis='24' port='0x27'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <alias name='pci.24'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <controller type='pci' index='25' model='pcie-root-port'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <target chassis='25' port='0x28'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <alias name='pci.25'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <model name='pcie-pci-bridge'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <alias name='pci.26'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <controller type='usb' index='0' model='piix3-uhci'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <alias name='usb'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <controller type='sata' index='0'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <alias name='ide'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <interface type='ethernet'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <mac address='fa:16:3e:0c:ce:19'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <target dev='tapc9f60650-13'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <model type='virtio'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <driver name='vhost' rx_queue_size='512'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <mtu size='1442'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <alias name='net0'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    </interface>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <serial type='pty'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <source path='/dev/pts/0'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <log file='/var/lib/nova/instances/c46b48c6-f17b-46fd-909f-65cf07dab4e6/console.log' append='off'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <target type='isa-serial' port='0'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:        <model name='isa-serial'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      </target>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <alias name='serial0'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    </serial>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <console type='pty' tty='/dev/pts/0'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <source path='/dev/pts/0'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <log file='/var/lib/nova/instances/c46b48c6-f17b-46fd-909f-65cf07dab4e6/console.log' append='off'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <target type='serial' port='0'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <alias name='serial0'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    </console>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <input type='tablet' bus='usb'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <alias name='input0'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <address type='usb' bus='0' port='1'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    </input>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <input type='mouse' bus='ps2'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <alias name='input1'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    </input>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <input type='keyboard' bus='ps2'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <alias name='input2'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    </input>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <listen type='address' address='::0'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    </graphics>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <audio id='1' type='none'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <video>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <model type='virtio' heads='1' primary='yes'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <alias name='video0'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    </video>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <watchdog model='itco' action='reset'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <alias name='watchdog0'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    </watchdog>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <memballoon model='virtio'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <stats period='10'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <alias name='balloon0'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    </memballoon>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <rng model='virtio'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <backend model='random'>/dev/urandom</backend>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <alias name='rng0'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    </rng>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  </devices>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <label>system_u:system_r:svirt_t:s0:c307,c925</label>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c307,c925</imagelabel>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  </seclabel>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <label>+107:+107</label>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <imagelabel>+107:+107</imagelabel>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  </seclabel>
Dec  1 05:21:54 np0005540826 nova_compute[229148]: </domain>
Dec  1 05:21:54 np0005540826 nova_compute[229148]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282#033[00m
Dec  1 05:21:54 np0005540826 nova_compute[229148]: 2025-12-01 10:21:54.387 229152 INFO nova.virt.libvirt.driver [None req-09bdf7db-8c31-40d7-bddf-4997f2924694 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Successfully detached device tap61c72b1f-ae from instance c46b48c6-f17b-46fd-909f-65cf07dab4e6 from the live domain config.#033[00m
Dec  1 05:21:54 np0005540826 nova_compute[229148]: 2025-12-01 10:21:54.388 229152 DEBUG nova.virt.libvirt.vif [None req-09bdf7db-8c31-40d7-bddf-4997f2924694 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-01T10:19:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1386308587',display_name='tempest-TestNetworkBasicOps-server-1386308587',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1386308587',id=6,image_ref='8f75d6de-6ce0-44e1-b417-d0111424475b',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNlsicanYXytlXTgYTmaMFboTco/f673p0Przv+d4K28ZijgatIkOSXNpjiz5VeubRvfxH8C0Wp0yqZF71ydwtKBTEW7G7vvokxNdm8j7TByXnmiUBYXpHHmHVf8HImEjA==',key_name='tempest-TestNetworkBasicOps-553113242',keypairs=<?>,launch_index=0,launched_at=2025-12-01T10:20:03Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='9f6be4e572624210b91193c011607c08',ramdisk_id='',reservation_id='r-djnfhl6c',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='8f75d6de-6ce0-44e1-b417-d0111424475b',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1248115384',owner_user_name='tempest-TestNetworkBasicOps-1248115384-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-12-01T10:20:04Z,user_data=None,user_id='5b56a238daf0445798410e51caada0ff',uuid=c46b48c6-f17b-46fd-909f-65cf07dab4e6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "61c72b1f-aea9-4ec0-80c5-03c2e87aad8f", "address": "fa:16:3e:71:2f:0e", "network": {"id": "0e5b3de9-56f5-4f4d-87c1-c01596567748", "bridge": "br-int", "label": "tempest-network-smoke--1903173849", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61c72b1f-ae", "ovs_interfaceid": "61c72b1f-aea9-4ec0-80c5-03c2e87aad8f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  1 05:21:54 np0005540826 nova_compute[229148]: 2025-12-01 10:21:54.388 229152 DEBUG nova.network.os_vif_util [None req-09bdf7db-8c31-40d7-bddf-4997f2924694 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Converting VIF {"id": "61c72b1f-aea9-4ec0-80c5-03c2e87aad8f", "address": "fa:16:3e:71:2f:0e", "network": {"id": "0e5b3de9-56f5-4f4d-87c1-c01596567748", "bridge": "br-int", "label": "tempest-network-smoke--1903173849", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61c72b1f-ae", "ovs_interfaceid": "61c72b1f-aea9-4ec0-80c5-03c2e87aad8f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  1 05:21:54 np0005540826 nova_compute[229148]: 2025-12-01 10:21:54.389 229152 DEBUG nova.network.os_vif_util [None req-09bdf7db-8c31-40d7-bddf-4997f2924694 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:71:2f:0e,bridge_name='br-int',has_traffic_filtering=True,id=61c72b1f-aea9-4ec0-80c5-03c2e87aad8f,network=Network(0e5b3de9-56f5-4f4d-87c1-c01596567748),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap61c72b1f-ae') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  1 05:21:54 np0005540826 nova_compute[229148]: 2025-12-01 10:21:54.389 229152 DEBUG os_vif [None req-09bdf7db-8c31-40d7-bddf-4997f2924694 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:71:2f:0e,bridge_name='br-int',has_traffic_filtering=True,id=61c72b1f-aea9-4ec0-80c5-03c2e87aad8f,network=Network(0e5b3de9-56f5-4f4d-87c1-c01596567748),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap61c72b1f-ae') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  1 05:21:54 np0005540826 nova_compute[229148]: 2025-12-01 10:21:54.391 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:21:54 np0005540826 nova_compute[229148]: 2025-12-01 10:21:54.391 229152 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap61c72b1f-ae, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  1 05:21:54 np0005540826 nova_compute[229148]: 2025-12-01 10:21:54.393 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:21:54 np0005540826 nova_compute[229148]: 2025-12-01 10:21:54.394 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:21:54 np0005540826 nova_compute[229148]: 2025-12-01 10:21:54.398 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:21:54 np0005540826 nova_compute[229148]: 2025-12-01 10:21:54.401 229152 INFO os_vif [None req-09bdf7db-8c31-40d7-bddf-4997f2924694 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:71:2f:0e,bridge_name='br-int',has_traffic_filtering=True,id=61c72b1f-aea9-4ec0-80c5-03c2e87aad8f,network=Network(0e5b3de9-56f5-4f4d-87c1-c01596567748),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap61c72b1f-ae')#033[00m
Dec  1 05:21:54 np0005540826 nova_compute[229148]: 2025-12-01 10:21:54.401 229152 DEBUG nova.virt.libvirt.guest [None req-09bdf7db-8c31-40d7-bddf-4997f2924694 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  <nova:name>tempest-TestNetworkBasicOps-server-1386308587</nova:name>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  <nova:creationTime>2025-12-01 10:21:54</nova:creationTime>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  <nova:flavor name="m1.nano">
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <nova:memory>128</nova:memory>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <nova:disk>1</nova:disk>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <nova:swap>0</nova:swap>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <nova:ephemeral>0</nova:ephemeral>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <nova:vcpus>1</nova:vcpus>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  </nova:flavor>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  <nova:owner>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <nova:user uuid="5b56a238daf0445798410e51caada0ff">tempest-TestNetworkBasicOps-1248115384-project-member</nova:user>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <nova:project uuid="9f6be4e572624210b91193c011607c08">tempest-TestNetworkBasicOps-1248115384</nova:project>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  </nova:owner>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  <nova:root type="image" uuid="8f75d6de-6ce0-44e1-b417-d0111424475b"/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  <nova:ports>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    <nova:port uuid="c9f60650-13a9-4fee-a03c-4878c9ea3a07">
Dec  1 05:21:54 np0005540826 nova_compute[229148]:      <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:    </nova:port>
Dec  1 05:21:54 np0005540826 nova_compute[229148]:  </nova:ports>
Dec  1 05:21:54 np0005540826 nova_compute[229148]: </nova:instance>
Dec  1 05:21:54 np0005540826 nova_compute[229148]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Dec  1 05:21:54 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:21:54.438 141685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:71:2f:0e 10.100.0.28', 'unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.28/28', 'neutron:device_id': 'c46b48c6-f17b-46fd-909f-65cf07dab4e6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0e5b3de9-56f5-4f4d-87c1-c01596567748', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9f6be4e572624210b91193c011607c08', 'neutron:revision_number': '5', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=26afc505-4eba-4dd1-91d0-5142d49a2356, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe07c6626d0>], logical_port=61c72b1f-aea9-4ec0-80c5-03c2e87aad8f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe07c6626d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  1 05:21:54 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:21:54.439 141685 INFO neutron.agent.ovn.metadata.agent [-] Port 61c72b1f-aea9-4ec0-80c5-03c2e87aad8f in datapath 0e5b3de9-56f5-4f4d-87c1-c01596567748 unbound from our chassis#033[00m
Dec  1 05:21:54 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:21:54.440 141685 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0e5b3de9-56f5-4f4d-87c1-c01596567748, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  1 05:21:54 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:21:54.441 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[1c1173ba-21f4-45c2-ab4d-549cbfa596de]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:21:54 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:21:54.442 141685 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-0e5b3de9-56f5-4f4d-87c1-c01596567748 namespace which is not needed anymore#033[00m
Dec  1 05:21:54 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:21:54 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:21:54 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:21:54.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:21:54 np0005540826 neutron-haproxy-ovnmeta-0e5b3de9-56f5-4f4d-87c1-c01596567748[239059]: [NOTICE]   (239063) : haproxy version is 2.8.14-c23fe91
Dec  1 05:21:54 np0005540826 neutron-haproxy-ovnmeta-0e5b3de9-56f5-4f4d-87c1-c01596567748[239059]: [NOTICE]   (239063) : path to executable is /usr/sbin/haproxy
Dec  1 05:21:54 np0005540826 neutron-haproxy-ovnmeta-0e5b3de9-56f5-4f4d-87c1-c01596567748[239059]: [WARNING]  (239063) : Exiting Master process...
Dec  1 05:21:54 np0005540826 neutron-haproxy-ovnmeta-0e5b3de9-56f5-4f4d-87c1-c01596567748[239059]: [ALERT]    (239063) : Current worker (239065) exited with code 143 (Terminated)
Dec  1 05:21:54 np0005540826 neutron-haproxy-ovnmeta-0e5b3de9-56f5-4f4d-87c1-c01596567748[239059]: [WARNING]  (239063) : All workers exited. Exiting... (0)
Dec  1 05:21:54 np0005540826 systemd[1]: libpod-766caed8a99adb8dce5fa589b713ffd1014a25ca81157b88500f2c485f16f61e.scope: Deactivated successfully.
Dec  1 05:21:54 np0005540826 podman[239599]: 2025-12-01 10:21:54.576832841 +0000 UTC m=+0.045175533 container died 766caed8a99adb8dce5fa589b713ffd1014a25ca81157b88500f2c485f16f61e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0e5b3de9-56f5-4f4d-87c1-c01596567748, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  1 05:21:54 np0005540826 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-766caed8a99adb8dce5fa589b713ffd1014a25ca81157b88500f2c485f16f61e-userdata-shm.mount: Deactivated successfully.
Dec  1 05:21:54 np0005540826 systemd[1]: var-lib-containers-storage-overlay-d058fcadc33cee113e06b74707e1ac5bf27f3bb3aab2826468c429150cc7fff9-merged.mount: Deactivated successfully.
Dec  1 05:21:54 np0005540826 podman[239599]: 2025-12-01 10:21:54.618453229 +0000 UTC m=+0.086795911 container cleanup 766caed8a99adb8dce5fa589b713ffd1014a25ca81157b88500f2c485f16f61e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0e5b3de9-56f5-4f4d-87c1-c01596567748, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec  1 05:21:54 np0005540826 systemd[1]: libpod-conmon-766caed8a99adb8dce5fa589b713ffd1014a25ca81157b88500f2c485f16f61e.scope: Deactivated successfully.
Dec  1 05:21:54 np0005540826 podman[239629]: 2025-12-01 10:21:54.688397135 +0000 UTC m=+0.046890059 container remove 766caed8a99adb8dce5fa589b713ffd1014a25ca81157b88500f2c485f16f61e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0e5b3de9-56f5-4f4d-87c1-c01596567748, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Dec  1 05:21:54 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:21:54.695 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[dd9f1edc-8f53-4511-a1ea-2f886b7be957]: (4, ('Mon Dec  1 10:21:54 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-0e5b3de9-56f5-4f4d-87c1-c01596567748 (766caed8a99adb8dce5fa589b713ffd1014a25ca81157b88500f2c485f16f61e)\n766caed8a99adb8dce5fa589b713ffd1014a25ca81157b88500f2c485f16f61e\nMon Dec  1 10:21:54 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-0e5b3de9-56f5-4f4d-87c1-c01596567748 (766caed8a99adb8dce5fa589b713ffd1014a25ca81157b88500f2c485f16f61e)\n766caed8a99adb8dce5fa589b713ffd1014a25ca81157b88500f2c485f16f61e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:21:54 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:21:54.697 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[45d27b32-e112-491d-a04e-b6244522afcc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:21:54 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:21:54.698 141685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0e5b3de9-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  1 05:21:54 np0005540826 nova_compute[229148]: 2025-12-01 10:21:54.699 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:21:54 np0005540826 kernel: tap0e5b3de9-50: left promiscuous mode
Dec  1 05:21:54 np0005540826 nova_compute[229148]: 2025-12-01 10:21:54.712 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:21:54 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:21:54.715 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[536238d1-672c-4482-980e-6b5612f37da1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:21:54 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:21:54.729 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[5ab124b6-e252-47b5-86fb-20eaabd13b26]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:21:54 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:21:54.731 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[1511f805-f9e9-49c8-ab25-30f548a6607d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:21:54 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:21:54.750 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[e8eaf417-2313-4ea8-9cc3-76c0069c5485]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 429497, 'reachable_time': 43060, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 239644, 'error': None, 'target': 'ovnmeta-0e5b3de9-56f5-4f4d-87c1-c01596567748', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:21:54 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:21:54.753 141797 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-0e5b3de9-56f5-4f4d-87c1-c01596567748 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec  1 05:21:54 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:21:54.753 141797 DEBUG oslo.privsep.daemon [-] privsep: reply[c2c7a8eb-b9f8-41dd-a0e5-47d93191b5fd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:21:54 np0005540826 systemd[1]: run-netns-ovnmeta\x2d0e5b3de9\x2d56f5\x2d4f4d\x2d87c1\x2dc01596567748.mount: Deactivated successfully.
Dec  1 05:21:55 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:21:55 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:21:55 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:21:55.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:21:55 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:21:55.306 141685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=b99910e3-15ec-4cc7-b887-f5229f22d165, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  1 05:21:56 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:21:56 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:21:56 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:21:56.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:21:56 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:21:57 np0005540826 nova_compute[229148]: 2025-12-01 10:21:57.062 229152 DEBUG nova.compute.manager [req-7fb0e63e-51fd-4233-b788-bd99c8e58a2e req-452e2d30-6eac-45fa-a67d-e6562b347ee3 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Received event network-vif-unplugged-61c72b1f-aea9-4ec0-80c5-03c2e87aad8f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  1 05:21:57 np0005540826 nova_compute[229148]: 2025-12-01 10:21:57.062 229152 DEBUG oslo_concurrency.lockutils [req-7fb0e63e-51fd-4233-b788-bd99c8e58a2e req-452e2d30-6eac-45fa-a67d-e6562b347ee3 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Acquiring lock "c46b48c6-f17b-46fd-909f-65cf07dab4e6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:21:57 np0005540826 nova_compute[229148]: 2025-12-01 10:21:57.063 229152 DEBUG oslo_concurrency.lockutils [req-7fb0e63e-51fd-4233-b788-bd99c8e58a2e req-452e2d30-6eac-45fa-a67d-e6562b347ee3 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Lock "c46b48c6-f17b-46fd-909f-65cf07dab4e6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:21:57 np0005540826 nova_compute[229148]: 2025-12-01 10:21:57.063 229152 DEBUG oslo_concurrency.lockutils [req-7fb0e63e-51fd-4233-b788-bd99c8e58a2e req-452e2d30-6eac-45fa-a67d-e6562b347ee3 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Lock "c46b48c6-f17b-46fd-909f-65cf07dab4e6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:21:57 np0005540826 nova_compute[229148]: 2025-12-01 10:21:57.063 229152 DEBUG nova.compute.manager [req-7fb0e63e-51fd-4233-b788-bd99c8e58a2e req-452e2d30-6eac-45fa-a67d-e6562b347ee3 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] No waiting events found dispatching network-vif-unplugged-61c72b1f-aea9-4ec0-80c5-03c2e87aad8f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  1 05:21:57 np0005540826 nova_compute[229148]: 2025-12-01 10:21:57.063 229152 WARNING nova.compute.manager [req-7fb0e63e-51fd-4233-b788-bd99c8e58a2e req-452e2d30-6eac-45fa-a67d-e6562b347ee3 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Received unexpected event network-vif-unplugged-61c72b1f-aea9-4ec0-80c5-03c2e87aad8f for instance with vm_state active and task_state None.#033[00m
Dec  1 05:21:57 np0005540826 nova_compute[229148]: 2025-12-01 10:21:57.065 229152 DEBUG nova.compute.manager [req-05808a91-c7de-4683-a42f-5a66ff2dfeb8 req-068013b5-b734-4da5-9f31-e1d6380ad82a dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Received event network-vif-deleted-61c72b1f-aea9-4ec0-80c5-03c2e87aad8f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  1 05:21:57 np0005540826 nova_compute[229148]: 2025-12-01 10:21:57.065 229152 INFO nova.compute.manager [req-05808a91-c7de-4683-a42f-5a66ff2dfeb8 req-068013b5-b734-4da5-9f31-e1d6380ad82a dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Neutron deleted interface 61c72b1f-aea9-4ec0-80c5-03c2e87aad8f; detaching it from the instance and deleting it from the info cache#033[00m
Dec  1 05:21:57 np0005540826 nova_compute[229148]: 2025-12-01 10:21:57.066 229152 DEBUG nova.network.neutron [req-05808a91-c7de-4683-a42f-5a66ff2dfeb8 req-068013b5-b734-4da5-9f31-e1d6380ad82a dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Updating instance_info_cache with network_info: [{"id": "c9f60650-13a9-4fee-a03c-4878c9ea3a07", "address": "fa:16:3e:0c:ce:19", "network": {"id": "4e3600e5-3d5c-4f84-844a-9271184774dd", "bridge": "br-int", "label": "tempest-network-smoke--432110090", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.243", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9f60650-13", "ovs_interfaceid": "c9f60650-13a9-4fee-a03c-4878c9ea3a07", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  1 05:21:57 np0005540826 nova_compute[229148]: 2025-12-01 10:21:57.110 229152 DEBUG nova.objects.instance [req-05808a91-c7de-4683-a42f-5a66ff2dfeb8 req-068013b5-b734-4da5-9f31-e1d6380ad82a dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Lazy-loading 'system_metadata' on Instance uuid c46b48c6-f17b-46fd-909f-65cf07dab4e6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  1 05:21:57 np0005540826 nova_compute[229148]: 2025-12-01 10:21:57.112 229152 DEBUG oslo_concurrency.lockutils [None req-09bdf7db-8c31-40d7-bddf-4997f2924694 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Acquiring lock "refresh_cache-c46b48c6-f17b-46fd-909f-65cf07dab4e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  1 05:21:57 np0005540826 nova_compute[229148]: 2025-12-01 10:21:57.139 229152 DEBUG nova.objects.instance [req-05808a91-c7de-4683-a42f-5a66ff2dfeb8 req-068013b5-b734-4da5-9f31-e1d6380ad82a dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Lazy-loading 'flavor' on Instance uuid c46b48c6-f17b-46fd-909f-65cf07dab4e6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  1 05:21:57 np0005540826 nova_compute[229148]: 2025-12-01 10:21:57.163 229152 DEBUG nova.virt.libvirt.vif [req-05808a91-c7de-4683-a42f-5a66ff2dfeb8 req-068013b5-b734-4da5-9f31-e1d6380ad82a dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-01T10:19:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1386308587',display_name='tempest-TestNetworkBasicOps-server-1386308587',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1386308587',id=6,image_ref='8f75d6de-6ce0-44e1-b417-d0111424475b',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNlsicanYXytlXTgYTmaMFboTco/f673p0Przv+d4K28ZijgatIkOSXNpjiz5VeubRvfxH8C0Wp0yqZF71ydwtKBTEW7G7vvokxNdm8j7TByXnmiUBYXpHHmHVf8HImEjA==',key_name='tempest-TestNetworkBasicOps-553113242',keypairs=<?>,launch_index=0,launched_at=2025-12-01T10:20:03Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='9f6be4e572624210b91193c011607c08',ramdisk_id='',reservation_id='r-djnfhl6c',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='8f75d6de-6ce0-44e1-b417-d0111424475b',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1248115384',owner_user_name='tempest-TestNetworkBasicOps-1248115384-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-12-01T10:20:04Z,user_data=None,user_id='5b56a238daf0445798410e51caada0ff',uuid=c46b48c6-f17b-46fd-909f-65cf07dab4e6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "61c72b1f-aea9-4ec0-80c5-03c2e87aad8f", "address": "fa:16:3e:71:2f:0e", "network": {"id": "0e5b3de9-56f5-4f4d-87c1-c01596567748", "bridge": "br-int", "label": "tempest-network-smoke--1903173849", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61c72b1f-ae", "ovs_interfaceid": "61c72b1f-aea9-4ec0-80c5-03c2e87aad8f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  1 05:21:57 np0005540826 nova_compute[229148]: 2025-12-01 10:21:57.163 229152 DEBUG nova.network.os_vif_util [req-05808a91-c7de-4683-a42f-5a66ff2dfeb8 req-068013b5-b734-4da5-9f31-e1d6380ad82a dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Converting VIF {"id": "61c72b1f-aea9-4ec0-80c5-03c2e87aad8f", "address": "fa:16:3e:71:2f:0e", "network": {"id": "0e5b3de9-56f5-4f4d-87c1-c01596567748", "bridge": "br-int", "label": "tempest-network-smoke--1903173849", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61c72b1f-ae", "ovs_interfaceid": "61c72b1f-aea9-4ec0-80c5-03c2e87aad8f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  1 05:21:57 np0005540826 nova_compute[229148]: 2025-12-01 10:21:57.164 229152 DEBUG nova.network.os_vif_util [req-05808a91-c7de-4683-a42f-5a66ff2dfeb8 req-068013b5-b734-4da5-9f31-e1d6380ad82a dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:71:2f:0e,bridge_name='br-int',has_traffic_filtering=True,id=61c72b1f-aea9-4ec0-80c5-03c2e87aad8f,network=Network(0e5b3de9-56f5-4f4d-87c1-c01596567748),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap61c72b1f-ae') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  1 05:21:57 np0005540826 nova_compute[229148]: 2025-12-01 10:21:57.166 229152 DEBUG nova.virt.libvirt.guest [req-05808a91-c7de-4683-a42f-5a66ff2dfeb8 req-068013b5-b734-4da5-9f31-e1d6380ad82a dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:71:2f:0e"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap61c72b1f-ae"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Dec  1 05:21:57 np0005540826 nova_compute[229148]: 2025-12-01 10:21:57.169 229152 DEBUG nova.virt.libvirt.guest [req-05808a91-c7de-4683-a42f-5a66ff2dfeb8 req-068013b5-b734-4da5-9f31-e1d6380ad82a dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:71:2f:0e"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap61c72b1f-ae"/></interface>not found in domain: <domain type='kvm' id='4'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:  <name>instance-00000006</name>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:  <uuid>c46b48c6-f17b-46fd-909f-65cf07dab4e6</uuid>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:  <metadata>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  1 05:21:57 np0005540826 nova_compute[229148]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:  <nova:name>tempest-TestNetworkBasicOps-server-1386308587</nova:name>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:  <nova:creationTime>2025-12-01 10:21:54</nova:creationTime>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:  <nova:flavor name="m1.nano">
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <nova:memory>128</nova:memory>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <nova:disk>1</nova:disk>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <nova:swap>0</nova:swap>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <nova:ephemeral>0</nova:ephemeral>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <nova:vcpus>1</nova:vcpus>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:  </nova:flavor>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:  <nova:owner>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <nova:user uuid="5b56a238daf0445798410e51caada0ff">tempest-TestNetworkBasicOps-1248115384-project-member</nova:user>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <nova:project uuid="9f6be4e572624210b91193c011607c08">tempest-TestNetworkBasicOps-1248115384</nova:project>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:  </nova:owner>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:  <nova:root type="image" uuid="8f75d6de-6ce0-44e1-b417-d0111424475b"/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:  <nova:ports>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <nova:port uuid="c9f60650-13a9-4fee-a03c-4878c9ea3a07">
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    </nova:port>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:  </nova:ports>
Dec  1 05:21:57 np0005540826 nova_compute[229148]: </nova:instance>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:  </metadata>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:  <memory unit='KiB'>131072</memory>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:  <currentMemory unit='KiB'>131072</currentMemory>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:  <vcpu placement='static'>1</vcpu>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:  <resource>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <partition>/machine</partition>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:  </resource>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:  <sysinfo type='smbios'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <system>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <entry name='manufacturer'>RDO</entry>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <entry name='product'>OpenStack Compute</entry>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <entry name='serial'>c46b48c6-f17b-46fd-909f-65cf07dab4e6</entry>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <entry name='uuid'>c46b48c6-f17b-46fd-909f-65cf07dab4e6</entry>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <entry name='family'>Virtual Machine</entry>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    </system>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:  </sysinfo>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:  <os>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <boot dev='hd'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <smbios mode='sysinfo'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:  </os>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:  <features>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <acpi/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <apic/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <vmcoreinfo state='on'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:  </features>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:  <cpu mode='custom' match='exact' check='full'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <model fallback='forbid'>EPYC-Rome</model>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <vendor>AMD</vendor>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <feature policy='require' name='x2apic'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <feature policy='require' name='tsc-deadline'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <feature policy='require' name='hypervisor'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <feature policy='require' name='tsc_adjust'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <feature policy='require' name='spec-ctrl'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <feature policy='require' name='stibp'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <feature policy='require' name='ssbd'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <feature policy='require' name='cmp_legacy'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <feature policy='require' name='overflow-recov'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <feature policy='require' name='succor'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <feature policy='require' name='ibrs'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <feature policy='require' name='amd-ssbd'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <feature policy='require' name='virt-ssbd'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <feature policy='disable' name='lbrv'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <feature policy='disable' name='tsc-scale'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <feature policy='disable' name='vmcb-clean'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <feature policy='disable' name='flushbyasid'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <feature policy='disable' name='pause-filter'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <feature policy='disable' name='pfthreshold'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <feature policy='disable' name='svme-addr-chk'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <feature policy='require' name='lfence-always-serializing'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <feature policy='disable' name='xsaves'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <feature policy='disable' name='svm'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <feature policy='require' name='topoext'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <feature policy='disable' name='npt'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <feature policy='disable' name='nrip-save'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:  </cpu>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:  <clock offset='utc'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <timer name='pit' tickpolicy='delay'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <timer name='rtc' tickpolicy='catchup'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <timer name='hpet' present='no'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:  </clock>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:  <on_poweroff>destroy</on_poweroff>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:  <on_reboot>restart</on_reboot>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:  <on_crash>destroy</on_crash>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:  <devices>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <disk type='network' device='disk'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <driver name='qemu' type='raw' cache='none'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <auth username='openstack'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:        <secret type='ceph' uuid='365f19c2-81e5-5edd-b6b4-280555214d3a'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      </auth>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <source protocol='rbd' name='vms/c46b48c6-f17b-46fd-909f-65cf07dab4e6_disk' index='2'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:        <host name='192.168.122.100' port='6789'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:        <host name='192.168.122.102' port='6789'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:        <host name='192.168.122.101' port='6789'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      </source>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <target dev='vda' bus='virtio'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <alias name='virtio-disk0'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    </disk>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <disk type='network' device='cdrom'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <driver name='qemu' type='raw' cache='none'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <auth username='openstack'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:        <secret type='ceph' uuid='365f19c2-81e5-5edd-b6b4-280555214d3a'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      </auth>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <source protocol='rbd' name='vms/c46b48c6-f17b-46fd-909f-65cf07dab4e6_disk.config' index='1'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:        <host name='192.168.122.100' port='6789'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:        <host name='192.168.122.102' port='6789'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:        <host name='192.168.122.101' port='6789'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      </source>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <target dev='sda' bus='sata'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <readonly/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <alias name='sata0-0-0'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    </disk>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <controller type='pci' index='0' model='pcie-root'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <alias name='pcie.0'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <controller type='pci' index='1' model='pcie-root-port'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <target chassis='1' port='0x10'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <alias name='pci.1'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <controller type='pci' index='2' model='pcie-root-port'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <target chassis='2' port='0x11'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <alias name='pci.2'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <controller type='pci' index='3' model='pcie-root-port'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <target chassis='3' port='0x12'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <alias name='pci.3'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <controller type='pci' index='4' model='pcie-root-port'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <target chassis='4' port='0x13'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <alias name='pci.4'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <controller type='pci' index='5' model='pcie-root-port'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <target chassis='5' port='0x14'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <alias name='pci.5'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <controller type='pci' index='6' model='pcie-root-port'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <target chassis='6' port='0x15'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <alias name='pci.6'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <controller type='pci' index='7' model='pcie-root-port'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <target chassis='7' port='0x16'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <alias name='pci.7'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <controller type='pci' index='8' model='pcie-root-port'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <target chassis='8' port='0x17'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <alias name='pci.8'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <controller type='pci' index='9' model='pcie-root-port'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <target chassis='9' port='0x18'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <alias name='pci.9'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <controller type='pci' index='10' model='pcie-root-port'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <target chassis='10' port='0x19'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <alias name='pci.10'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <controller type='pci' index='11' model='pcie-root-port'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <target chassis='11' port='0x1a'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <alias name='pci.11'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <controller type='pci' index='12' model='pcie-root-port'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <target chassis='12' port='0x1b'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <alias name='pci.12'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <controller type='pci' index='13' model='pcie-root-port'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <target chassis='13' port='0x1c'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <alias name='pci.13'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <controller type='pci' index='14' model='pcie-root-port'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <target chassis='14' port='0x1d'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <alias name='pci.14'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <controller type='pci' index='15' model='pcie-root-port'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <target chassis='15' port='0x1e'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <alias name='pci.15'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <controller type='pci' index='16' model='pcie-root-port'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <target chassis='16' port='0x1f'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <alias name='pci.16'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <controller type='pci' index='17' model='pcie-root-port'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <target chassis='17' port='0x20'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <alias name='pci.17'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <controller type='pci' index='18' model='pcie-root-port'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <target chassis='18' port='0x21'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <alias name='pci.18'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <controller type='pci' index='19' model='pcie-root-port'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <target chassis='19' port='0x22'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <alias name='pci.19'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <controller type='pci' index='20' model='pcie-root-port'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <target chassis='20' port='0x23'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <alias name='pci.20'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <controller type='pci' index='21' model='pcie-root-port'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <target chassis='21' port='0x24'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <alias name='pci.21'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <controller type='pci' index='22' model='pcie-root-port'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <target chassis='22' port='0x25'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <alias name='pci.22'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <controller type='pci' index='23' model='pcie-root-port'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <target chassis='23' port='0x26'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <alias name='pci.23'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <controller type='pci' index='24' model='pcie-root-port'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <target chassis='24' port='0x27'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <alias name='pci.24'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <controller type='pci' index='25' model='pcie-root-port'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <target chassis='25' port='0x28'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <alias name='pci.25'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <model name='pcie-pci-bridge'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <alias name='pci.26'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <controller type='usb' index='0' model='piix3-uhci'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <alias name='usb'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <controller type='sata' index='0'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <alias name='ide'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <interface type='ethernet'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <mac address='fa:16:3e:0c:ce:19'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <target dev='tapc9f60650-13'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <model type='virtio'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <driver name='vhost' rx_queue_size='512'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <mtu size='1442'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <alias name='net0'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    </interface>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <serial type='pty'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <source path='/dev/pts/0'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <log file='/var/lib/nova/instances/c46b48c6-f17b-46fd-909f-65cf07dab4e6/console.log' append='off'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <target type='isa-serial' port='0'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:        <model name='isa-serial'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      </target>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <alias name='serial0'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    </serial>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <console type='pty' tty='/dev/pts/0'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <source path='/dev/pts/0'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <log file='/var/lib/nova/instances/c46b48c6-f17b-46fd-909f-65cf07dab4e6/console.log' append='off'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <target type='serial' port='0'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <alias name='serial0'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    </console>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <input type='tablet' bus='usb'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <alias name='input0'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <address type='usb' bus='0' port='1'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    </input>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <input type='mouse' bus='ps2'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <alias name='input1'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    </input>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <input type='keyboard' bus='ps2'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <alias name='input2'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    </input>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <listen type='address' address='::0'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    </graphics>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <audio id='1' type='none'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <video>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <model type='virtio' heads='1' primary='yes'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <alias name='video0'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    </video>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <watchdog model='itco' action='reset'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <alias name='watchdog0'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    </watchdog>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <memballoon model='virtio'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <stats period='10'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <alias name='balloon0'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    </memballoon>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <rng model='virtio'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <backend model='random'>/dev/urandom</backend>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <alias name='rng0'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    </rng>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:  </devices>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <label>system_u:system_r:svirt_t:s0:c307,c925</label>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c307,c925</imagelabel>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:  </seclabel>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <label>+107:+107</label>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <imagelabel>+107:+107</imagelabel>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:  </seclabel>
Dec  1 05:21:57 np0005540826 nova_compute[229148]: </domain>
Dec  1 05:21:57 np0005540826 nova_compute[229148]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Dec  1 05:21:57 np0005540826 nova_compute[229148]: 2025-12-01 10:21:57.171 229152 DEBUG nova.virt.libvirt.guest [req-05808a91-c7de-4683-a42f-5a66ff2dfeb8 req-068013b5-b734-4da5-9f31-e1d6380ad82a dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:71:2f:0e"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap61c72b1f-ae"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Dec  1 05:21:57 np0005540826 nova_compute[229148]: 2025-12-01 10:21:57.174 229152 DEBUG nova.virt.libvirt.guest [req-05808a91-c7de-4683-a42f-5a66ff2dfeb8 req-068013b5-b734-4da5-9f31-e1d6380ad82a dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:71:2f:0e"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap61c72b1f-ae"/></interface>not found in domain: <domain type='kvm' id='4'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:  <name>instance-00000006</name>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:  <uuid>c46b48c6-f17b-46fd-909f-65cf07dab4e6</uuid>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:  <metadata>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  1 05:21:57 np0005540826 nova_compute[229148]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:  <nova:name>tempest-TestNetworkBasicOps-server-1386308587</nova:name>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:  <nova:creationTime>2025-12-01 10:21:54</nova:creationTime>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:  <nova:flavor name="m1.nano">
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <nova:memory>128</nova:memory>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <nova:disk>1</nova:disk>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <nova:swap>0</nova:swap>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <nova:ephemeral>0</nova:ephemeral>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <nova:vcpus>1</nova:vcpus>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:  </nova:flavor>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:  <nova:owner>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <nova:user uuid="5b56a238daf0445798410e51caada0ff">tempest-TestNetworkBasicOps-1248115384-project-member</nova:user>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <nova:project uuid="9f6be4e572624210b91193c011607c08">tempest-TestNetworkBasicOps-1248115384</nova:project>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:  </nova:owner>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:  <nova:root type="image" uuid="8f75d6de-6ce0-44e1-b417-d0111424475b"/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:  <nova:ports>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <nova:port uuid="c9f60650-13a9-4fee-a03c-4878c9ea3a07">
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    </nova:port>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:  </nova:ports>
Dec  1 05:21:57 np0005540826 nova_compute[229148]: </nova:instance>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:  </metadata>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:  <memory unit='KiB'>131072</memory>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:  <currentMemory unit='KiB'>131072</currentMemory>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:  <vcpu placement='static'>1</vcpu>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:  <resource>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <partition>/machine</partition>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:  </resource>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:  <sysinfo type='smbios'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <system>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <entry name='manufacturer'>RDO</entry>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <entry name='product'>OpenStack Compute</entry>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <entry name='serial'>c46b48c6-f17b-46fd-909f-65cf07dab4e6</entry>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <entry name='uuid'>c46b48c6-f17b-46fd-909f-65cf07dab4e6</entry>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <entry name='family'>Virtual Machine</entry>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    </system>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:  </sysinfo>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:  <os>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <boot dev='hd'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <smbios mode='sysinfo'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:  </os>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:  <features>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <acpi/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <apic/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <vmcoreinfo state='on'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:  </features>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:  <cpu mode='custom' match='exact' check='full'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <model fallback='forbid'>EPYC-Rome</model>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <vendor>AMD</vendor>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <feature policy='require' name='x2apic'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <feature policy='require' name='tsc-deadline'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <feature policy='require' name='hypervisor'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <feature policy='require' name='tsc_adjust'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <feature policy='require' name='spec-ctrl'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <feature policy='require' name='stibp'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <feature policy='require' name='ssbd'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <feature policy='require' name='cmp_legacy'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <feature policy='require' name='overflow-recov'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <feature policy='require' name='succor'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <feature policy='require' name='ibrs'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <feature policy='require' name='amd-ssbd'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <feature policy='require' name='virt-ssbd'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <feature policy='disable' name='lbrv'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <feature policy='disable' name='tsc-scale'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <feature policy='disable' name='vmcb-clean'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <feature policy='disable' name='flushbyasid'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <feature policy='disable' name='pause-filter'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <feature policy='disable' name='pfthreshold'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <feature policy='disable' name='svme-addr-chk'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <feature policy='require' name='lfence-always-serializing'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <feature policy='disable' name='xsaves'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <feature policy='disable' name='svm'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <feature policy='require' name='topoext'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <feature policy='disable' name='npt'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <feature policy='disable' name='nrip-save'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:  </cpu>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:  <clock offset='utc'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <timer name='pit' tickpolicy='delay'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <timer name='rtc' tickpolicy='catchup'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <timer name='hpet' present='no'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:  </clock>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:  <on_poweroff>destroy</on_poweroff>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:  <on_reboot>restart</on_reboot>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:  <on_crash>destroy</on_crash>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:  <devices>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <disk type='network' device='disk'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <driver name='qemu' type='raw' cache='none'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <auth username='openstack'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:        <secret type='ceph' uuid='365f19c2-81e5-5edd-b6b4-280555214d3a'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      </auth>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <source protocol='rbd' name='vms/c46b48c6-f17b-46fd-909f-65cf07dab4e6_disk' index='2'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:        <host name='192.168.122.100' port='6789'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:        <host name='192.168.122.102' port='6789'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:        <host name='192.168.122.101' port='6789'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      </source>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <target dev='vda' bus='virtio'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <alias name='virtio-disk0'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    </disk>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <disk type='network' device='cdrom'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <driver name='qemu' type='raw' cache='none'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <auth username='openstack'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:        <secret type='ceph' uuid='365f19c2-81e5-5edd-b6b4-280555214d3a'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      </auth>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <source protocol='rbd' name='vms/c46b48c6-f17b-46fd-909f-65cf07dab4e6_disk.config' index='1'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:        <host name='192.168.122.100' port='6789'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:        <host name='192.168.122.102' port='6789'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:        <host name='192.168.122.101' port='6789'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      </source>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <target dev='sda' bus='sata'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <readonly/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <alias name='sata0-0-0'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    </disk>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <controller type='pci' index='0' model='pcie-root'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <alias name='pcie.0'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <controller type='pci' index='1' model='pcie-root-port'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <target chassis='1' port='0x10'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <alias name='pci.1'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <controller type='pci' index='2' model='pcie-root-port'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <target chassis='2' port='0x11'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <alias name='pci.2'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <controller type='pci' index='3' model='pcie-root-port'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <target chassis='3' port='0x12'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <alias name='pci.3'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <controller type='pci' index='4' model='pcie-root-port'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <target chassis='4' port='0x13'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <alias name='pci.4'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <controller type='pci' index='5' model='pcie-root-port'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <target chassis='5' port='0x14'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <alias name='pci.5'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <controller type='pci' index='6' model='pcie-root-port'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <target chassis='6' port='0x15'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <alias name='pci.6'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <controller type='pci' index='7' model='pcie-root-port'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <target chassis='7' port='0x16'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <alias name='pci.7'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <controller type='pci' index='8' model='pcie-root-port'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <target chassis='8' port='0x17'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <alias name='pci.8'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <controller type='pci' index='9' model='pcie-root-port'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <target chassis='9' port='0x18'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <alias name='pci.9'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <controller type='pci' index='10' model='pcie-root-port'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <target chassis='10' port='0x19'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <alias name='pci.10'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <controller type='pci' index='11' model='pcie-root-port'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <target chassis='11' port='0x1a'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <alias name='pci.11'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <controller type='pci' index='12' model='pcie-root-port'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <target chassis='12' port='0x1b'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <alias name='pci.12'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <controller type='pci' index='13' model='pcie-root-port'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <target chassis='13' port='0x1c'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <alias name='pci.13'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <controller type='pci' index='14' model='pcie-root-port'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <target chassis='14' port='0x1d'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <alias name='pci.14'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <controller type='pci' index='15' model='pcie-root-port'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <target chassis='15' port='0x1e'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <alias name='pci.15'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <controller type='pci' index='16' model='pcie-root-port'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <target chassis='16' port='0x1f'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <alias name='pci.16'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <controller type='pci' index='17' model='pcie-root-port'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <target chassis='17' port='0x20'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <alias name='pci.17'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <controller type='pci' index='18' model='pcie-root-port'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <target chassis='18' port='0x21'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <alias name='pci.18'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <controller type='pci' index='19' model='pcie-root-port'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <target chassis='19' port='0x22'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <alias name='pci.19'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <controller type='pci' index='20' model='pcie-root-port'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <target chassis='20' port='0x23'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <alias name='pci.20'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <controller type='pci' index='21' model='pcie-root-port'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <target chassis='21' port='0x24'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <alias name='pci.21'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <controller type='pci' index='22' model='pcie-root-port'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <target chassis='22' port='0x25'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <alias name='pci.22'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <controller type='pci' index='23' model='pcie-root-port'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <target chassis='23' port='0x26'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <alias name='pci.23'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <controller type='pci' index='24' model='pcie-root-port'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <target chassis='24' port='0x27'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <alias name='pci.24'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <controller type='pci' index='25' model='pcie-root-port'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <model name='pcie-root-port'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <target chassis='25' port='0x28'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <alias name='pci.25'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <model name='pcie-pci-bridge'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <alias name='pci.26'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <controller type='usb' index='0' model='piix3-uhci'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <alias name='usb'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <controller type='sata' index='0'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <alias name='ide'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    </controller>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <interface type='ethernet'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <mac address='fa:16:3e:0c:ce:19'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <target dev='tapc9f60650-13'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <model type='virtio'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <driver name='vhost' rx_queue_size='512'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <mtu size='1442'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <alias name='net0'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    </interface>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <serial type='pty'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <source path='/dev/pts/0'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <log file='/var/lib/nova/instances/c46b48c6-f17b-46fd-909f-65cf07dab4e6/console.log' append='off'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <target type='isa-serial' port='0'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:        <model name='isa-serial'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      </target>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <alias name='serial0'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    </serial>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <console type='pty' tty='/dev/pts/0'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <source path='/dev/pts/0'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <log file='/var/lib/nova/instances/c46b48c6-f17b-46fd-909f-65cf07dab4e6/console.log' append='off'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <target type='serial' port='0'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <alias name='serial0'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    </console>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <input type='tablet' bus='usb'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <alias name='input0'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <address type='usb' bus='0' port='1'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    </input>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <input type='mouse' bus='ps2'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <alias name='input1'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    </input>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <input type='keyboard' bus='ps2'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <alias name='input2'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    </input>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <listen type='address' address='::0'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    </graphics>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <audio id='1' type='none'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <video>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <model type='virtio' heads='1' primary='yes'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <alias name='video0'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    </video>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <watchdog model='itco' action='reset'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <alias name='watchdog0'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    </watchdog>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <memballoon model='virtio'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <stats period='10'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <alias name='balloon0'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    </memballoon>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <rng model='virtio'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <backend model='random'>/dev/urandom</backend>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <alias name='rng0'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    </rng>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:  </devices>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <label>system_u:system_r:svirt_t:s0:c307,c925</label>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c307,c925</imagelabel>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:  </seclabel>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <label>+107:+107</label>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <imagelabel>+107:+107</imagelabel>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:  </seclabel>
Dec  1 05:21:57 np0005540826 nova_compute[229148]: </domain>
Dec  1 05:21:57 np0005540826 nova_compute[229148]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Dec  1 05:21:57 np0005540826 nova_compute[229148]: 2025-12-01 10:21:57.175 229152 WARNING nova.virt.libvirt.driver [req-05808a91-c7de-4683-a42f-5a66ff2dfeb8 req-068013b5-b734-4da5-9f31-e1d6380ad82a dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Detaching interface fa:16:3e:71:2f:0e failed because the device is no longer found on the guest.: nova.exception.DeviceNotFound: Device 'tap61c72b1f-ae' not found.
Dec  1 05:21:57 np0005540826 nova_compute[229148]: 2025-12-01 10:21:57.176 229152 DEBUG nova.virt.libvirt.vif [req-05808a91-c7de-4683-a42f-5a66ff2dfeb8 req-068013b5-b734-4da5-9f31-e1d6380ad82a dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-01T10:19:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1386308587',display_name='tempest-TestNetworkBasicOps-server-1386308587',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1386308587',id=6,image_ref='8f75d6de-6ce0-44e1-b417-d0111424475b',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNlsicanYXytlXTgYTmaMFboTco/f673p0Przv+d4K28ZijgatIkOSXNpjiz5VeubRvfxH8C0Wp0yqZF71ydwtKBTEW7G7vvokxNdm8j7TByXnmiUBYXpHHmHVf8HImEjA==',key_name='tempest-TestNetworkBasicOps-553113242',keypairs=<?>,launch_index=0,launched_at=2025-12-01T10:20:03Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='9f6be4e572624210b91193c011607c08',ramdisk_id='',reservation_id='r-djnfhl6c',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='8f75d6de-6ce0-44e1-b417-d0111424475b',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1248115384',owner_user_name='tempest-TestNetworkBasicOps-1248115384-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-12-01T10:20:04Z,user_data=None,user_id='5b56a238daf0445798410e51caada0ff',uuid=c46b48c6-f17b-46fd-909f-65cf07dab4e6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "61c72b1f-aea9-4ec0-80c5-03c2e87aad8f", "address": "fa:16:3e:71:2f:0e", "network": {"id": "0e5b3de9-56f5-4f4d-87c1-c01596567748", "bridge": "br-int", "label": "tempest-network-smoke--1903173849", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61c72b1f-ae", "ovs_interfaceid": "61c72b1f-aea9-4ec0-80c5-03c2e87aad8f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec  1 05:21:57 np0005540826 nova_compute[229148]: 2025-12-01 10:21:57.176 229152 DEBUG nova.network.os_vif_util [req-05808a91-c7de-4683-a42f-5a66ff2dfeb8 req-068013b5-b734-4da5-9f31-e1d6380ad82a dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Converting VIF {"id": "61c72b1f-aea9-4ec0-80c5-03c2e87aad8f", "address": "fa:16:3e:71:2f:0e", "network": {"id": "0e5b3de9-56f5-4f4d-87c1-c01596567748", "bridge": "br-int", "label": "tempest-network-smoke--1903173849", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61c72b1f-ae", "ovs_interfaceid": "61c72b1f-aea9-4ec0-80c5-03c2e87aad8f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec  1 05:21:57 np0005540826 nova_compute[229148]: 2025-12-01 10:21:57.177 229152 DEBUG nova.network.os_vif_util [req-05808a91-c7de-4683-a42f-5a66ff2dfeb8 req-068013b5-b734-4da5-9f31-e1d6380ad82a dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:71:2f:0e,bridge_name='br-int',has_traffic_filtering=True,id=61c72b1f-aea9-4ec0-80c5-03c2e87aad8f,network=Network(0e5b3de9-56f5-4f4d-87c1-c01596567748),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap61c72b1f-ae') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec  1 05:21:57 np0005540826 nova_compute[229148]: 2025-12-01 10:21:57.177 229152 DEBUG os_vif [req-05808a91-c7de-4683-a42f-5a66ff2dfeb8 req-068013b5-b734-4da5-9f31-e1d6380ad82a dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:71:2f:0e,bridge_name='br-int',has_traffic_filtering=True,id=61c72b1f-aea9-4ec0-80c5-03c2e87aad8f,network=Network(0e5b3de9-56f5-4f4d-87c1-c01596567748),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap61c72b1f-ae') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec  1 05:21:57 np0005540826 nova_compute[229148]: 2025-12-01 10:21:57.179 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  1 05:21:57 np0005540826 nova_compute[229148]: 2025-12-01 10:21:57.179 229152 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap61c72b1f-ae, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec  1 05:21:57 np0005540826 nova_compute[229148]: 2025-12-01 10:21:57.180 229152 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec  1 05:21:57 np0005540826 nova_compute[229148]: 2025-12-01 10:21:57.182 229152 INFO os_vif [req-05808a91-c7de-4683-a42f-5a66ff2dfeb8 req-068013b5-b734-4da5-9f31-e1d6380ad82a dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:71:2f:0e,bridge_name='br-int',has_traffic_filtering=True,id=61c72b1f-aea9-4ec0-80c5-03c2e87aad8f,network=Network(0e5b3de9-56f5-4f4d-87c1-c01596567748),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap61c72b1f-ae')
Dec  1 05:21:57 np0005540826 nova_compute[229148]: 2025-12-01 10:21:57.183 229152 DEBUG nova.virt.libvirt.guest [req-05808a91-c7de-4683-a42f-5a66ff2dfeb8 req-068013b5-b734-4da5-9f31-e1d6380ad82a dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  1 05:21:57 np0005540826 nova_compute[229148]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:  <nova:name>tempest-TestNetworkBasicOps-server-1386308587</nova:name>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:  <nova:creationTime>2025-12-01 10:21:57</nova:creationTime>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:  <nova:flavor name="m1.nano">
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <nova:memory>128</nova:memory>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <nova:disk>1</nova:disk>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <nova:swap>0</nova:swap>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <nova:ephemeral>0</nova:ephemeral>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <nova:vcpus>1</nova:vcpus>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:  </nova:flavor>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:  <nova:owner>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <nova:user uuid="5b56a238daf0445798410e51caada0ff">tempest-TestNetworkBasicOps-1248115384-project-member</nova:user>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <nova:project uuid="9f6be4e572624210b91193c011607c08">tempest-TestNetworkBasicOps-1248115384</nova:project>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:  </nova:owner>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:  <nova:root type="image" uuid="8f75d6de-6ce0-44e1-b417-d0111424475b"/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:  <nova:ports>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    <nova:port uuid="c9f60650-13a9-4fee-a03c-4878c9ea3a07">
Dec  1 05:21:57 np0005540826 nova_compute[229148]:      <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:    </nova:port>
Dec  1 05:21:57 np0005540826 nova_compute[229148]:  </nova:ports>
Dec  1 05:21:57 np0005540826 nova_compute[229148]: </nova:instance>
Dec  1 05:21:57 np0005540826 nova_compute[229148]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Dec  1 05:21:57 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:21:57 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 05:21:57 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:21:57.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 05:21:57 np0005540826 nova_compute[229148]: 2025-12-01 10:21:57.458 229152 DEBUG nova.network.neutron [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Updating instance_info_cache with network_info: [{"id": "c9f60650-13a9-4fee-a03c-4878c9ea3a07", "address": "fa:16:3e:0c:ce:19", "network": {"id": "4e3600e5-3d5c-4f84-844a-9271184774dd", "bridge": "br-int", "label": "tempest-network-smoke--432110090", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.243", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9f60650-13", "ovs_interfaceid": "c9f60650-13a9-4fee-a03c-4878c9ea3a07", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "61c72b1f-aea9-4ec0-80c5-03c2e87aad8f", "address": "fa:16:3e:71:2f:0e", "network": {"id": "0e5b3de9-56f5-4f4d-87c1-c01596567748", "bridge": "br-int", "label": "tempest-network-smoke--1903173849", "subnets": [], "meta": {"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61c72b1f-ae", "ovs_interfaceid": "61c72b1f-aea9-4ec0-80c5-03c2e87aad8f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  1 05:21:57 np0005540826 nova_compute[229148]: 2025-12-01 10:21:57.481 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Releasing lock "refresh_cache-c46b48c6-f17b-46fd-909f-65cf07dab4e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  1 05:21:57 np0005540826 nova_compute[229148]: 2025-12-01 10:21:57.482 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Dec  1 05:21:57 np0005540826 nova_compute[229148]: 2025-12-01 10:21:57.483 229152 DEBUG oslo_concurrency.lockutils [None req-09bdf7db-8c31-40d7-bddf-4997f2924694 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Acquired lock "refresh_cache-c46b48c6-f17b-46fd-909f-65cf07dab4e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  1 05:21:57 np0005540826 nova_compute[229148]: 2025-12-01 10:21:57.484 229152 DEBUG nova.network.neutron [None req-09bdf7db-8c31-40d7-bddf-4997f2924694 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  1 05:21:57 np0005540826 nova_compute[229148]: 2025-12-01 10:21:57.486 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:21:57 np0005540826 nova_compute[229148]: 2025-12-01 10:21:57.488 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:21:57 np0005540826 nova_compute[229148]: 2025-12-01 10:21:57.488 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:21:57 np0005540826 nova_compute[229148]: 2025-12-01 10:21:57.488 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:21:57 np0005540826 nova_compute[229148]: 2025-12-01 10:21:57.489 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:21:57 np0005540826 nova_compute[229148]: 2025-12-01 10:21:57.489 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  1 05:21:57 np0005540826 nova_compute[229148]: 2025-12-01 10:21:57.490 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:21:57 np0005540826 nova_compute[229148]: 2025-12-01 10:21:57.511 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:21:57 np0005540826 nova_compute[229148]: 2025-12-01 10:21:57.512 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:21:57 np0005540826 nova_compute[229148]: 2025-12-01 10:21:57.512 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:21:57 np0005540826 nova_compute[229148]: 2025-12-01 10:21:57.512 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  1 05:21:57 np0005540826 nova_compute[229148]: 2025-12-01 10:21:57.512 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:21:57 np0005540826 ovn_controller[132309]: 2025-12-01T10:21:57Z|00081|binding|INFO|Releasing lport f9f66464-5468-467b-973f-59839c1856fe from this chassis (sb_readonly=0)
Dec  1 05:21:57 np0005540826 nova_compute[229148]: 2025-12-01 10:21:57.574 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:21:58 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:21:58 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3813778074' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:21:58 np0005540826 nova_compute[229148]: 2025-12-01 10:21:58.072 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.560s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:21:58 np0005540826 nova_compute[229148]: 2025-12-01 10:21:58.138 229152 DEBUG nova.virt.libvirt.driver [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] skipping disk for instance-00000006 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Dec  1 05:21:58 np0005540826 nova_compute[229148]: 2025-12-01 10:21:58.139 229152 DEBUG nova.virt.libvirt.driver [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] skipping disk for instance-00000006 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Dec  1 05:21:58 np0005540826 nova_compute[229148]: 2025-12-01 10:21:58.296 229152 WARNING nova.virt.libvirt.driver [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  1 05:21:58 np0005540826 nova_compute[229148]: 2025-12-01 10:21:58.297 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4742MB free_disk=59.94247817993164GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  1 05:21:58 np0005540826 nova_compute[229148]: 2025-12-01 10:21:58.297 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:21:58 np0005540826 nova_compute[229148]: 2025-12-01 10:21:58.298 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:21:58 np0005540826 nova_compute[229148]: 2025-12-01 10:21:58.397 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Instance c46b48c6-f17b-46fd-909f-65cf07dab4e6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  1 05:21:58 np0005540826 nova_compute[229148]: 2025-12-01 10:21:58.397 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  1 05:21:58 np0005540826 nova_compute[229148]: 2025-12-01 10:21:58.397 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=59GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  1 05:21:58 np0005540826 nova_compute[229148]: 2025-12-01 10:21:58.404 229152 DEBUG oslo_concurrency.lockutils [None req-8968846d-7dd8-4b72-b931-8ede11f3af85 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Acquiring lock "c46b48c6-f17b-46fd-909f-65cf07dab4e6" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:21:58 np0005540826 nova_compute[229148]: 2025-12-01 10:21:58.405 229152 DEBUG oslo_concurrency.lockutils [None req-8968846d-7dd8-4b72-b931-8ede11f3af85 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "c46b48c6-f17b-46fd-909f-65cf07dab4e6" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:21:58 np0005540826 nova_compute[229148]: 2025-12-01 10:21:58.405 229152 DEBUG oslo_concurrency.lockutils [None req-8968846d-7dd8-4b72-b931-8ede11f3af85 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Acquiring lock "c46b48c6-f17b-46fd-909f-65cf07dab4e6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:21:58 np0005540826 nova_compute[229148]: 2025-12-01 10:21:58.405 229152 DEBUG oslo_concurrency.lockutils [None req-8968846d-7dd8-4b72-b931-8ede11f3af85 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "c46b48c6-f17b-46fd-909f-65cf07dab4e6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:21:58 np0005540826 nova_compute[229148]: 2025-12-01 10:21:58.406 229152 DEBUG oslo_concurrency.lockutils [None req-8968846d-7dd8-4b72-b931-8ede11f3af85 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "c46b48c6-f17b-46fd-909f-65cf07dab4e6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:21:58 np0005540826 nova_compute[229148]: 2025-12-01 10:21:58.407 229152 INFO nova.compute.manager [None req-8968846d-7dd8-4b72-b931-8ede11f3af85 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Terminating instance#033[00m
Dec  1 05:21:58 np0005540826 nova_compute[229148]: 2025-12-01 10:21:58.408 229152 DEBUG nova.compute.manager [None req-8968846d-7dd8-4b72-b931-8ede11f3af85 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  1 05:21:58 np0005540826 nova_compute[229148]: 2025-12-01 10:21:58.424 229152 DEBUG nova.scheduler.client.report [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Refreshing inventories for resource provider 19014d04-db84-4f3d-831b-084720e9168c _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Dec  1 05:21:58 np0005540826 nova_compute[229148]: 2025-12-01 10:21:58.451 229152 DEBUG nova.scheduler.client.report [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Updating ProviderTree inventory for provider 19014d04-db84-4f3d-831b-084720e9168c from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Dec  1 05:21:58 np0005540826 nova_compute[229148]: 2025-12-01 10:21:58.451 229152 DEBUG nova.compute.provider_tree [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Updating inventory in ProviderTree for provider 19014d04-db84-4f3d-831b-084720e9168c with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Dec  1 05:21:58 np0005540826 nova_compute[229148]: 2025-12-01 10:21:58.475 229152 DEBUG nova.scheduler.client.report [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Refreshing aggregate associations for resource provider 19014d04-db84-4f3d-831b-084720e9168c, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Dec  1 05:21:58 np0005540826 nova_compute[229148]: 2025-12-01 10:21:58.519 229152 DEBUG nova.scheduler.client.report [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Refreshing trait associations for resource provider 19014d04-db84-4f3d-831b-084720e9168c, traits: COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SHA,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE2,HW_CPU_X86_ABM,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_FDC,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE4A,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE,HW_CPU_X86_AMD_SVM,HW_CPU_X86_MMX,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_AESNI,HW_CPU_X86_AVX2,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSE42,COMPUTE_RESCUE_BFV,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_AVX,HW_CPU_X86_F16C,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_CLMUL,HW_CPU_X86_BMI,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_BMI2,HW_CPU_X86_FMA3,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SVM,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SOCKET_PCI_NUMA_AFFINITY _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Dec  1 05:21:58 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:21:58 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:21:58 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:21:58.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:21:58 np0005540826 kernel: tapc9f60650-13 (unregistering): left promiscuous mode
Dec  1 05:21:58 np0005540826 NetworkManager[48989]: <info>  [1764584518.5785] device (tapc9f60650-13): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  1 05:21:58 np0005540826 ovn_controller[132309]: 2025-12-01T10:21:58Z|00082|binding|INFO|Releasing lport c9f60650-13a9-4fee-a03c-4878c9ea3a07 from this chassis (sb_readonly=0)
Dec  1 05:21:58 np0005540826 ovn_controller[132309]: 2025-12-01T10:21:58Z|00083|binding|INFO|Setting lport c9f60650-13a9-4fee-a03c-4878c9ea3a07 down in Southbound
Dec  1 05:21:58 np0005540826 nova_compute[229148]: 2025-12-01 10:21:58.586 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:21:58 np0005540826 ovn_controller[132309]: 2025-12-01T10:21:58Z|00084|binding|INFO|Removing iface tapc9f60650-13 ovn-installed in OVS
Dec  1 05:21:58 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:21:58.595 141685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0c:ce:19 10.100.0.10'], port_security=['fa:16:3e:0c:ce:19 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'c46b48c6-f17b-46fd-909f-65cf07dab4e6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4e3600e5-3d5c-4f84-844a-9271184774dd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9f6be4e572624210b91193c011607c08', 'neutron:revision_number': '4', 'neutron:security_group_ids': '5dbb3e99-00d5-4ffd-aeb7-01c272921789', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=52a2e477-8890-44bf-b68f-dad9b43df531, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe07c6626d0>], logical_port=c9f60650-13a9-4fee-a03c-4878c9ea3a07) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe07c6626d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  1 05:21:58 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:21:58.596 141685 INFO neutron.agent.ovn.metadata.agent [-] Port c9f60650-13a9-4fee-a03c-4878c9ea3a07 in datapath 4e3600e5-3d5c-4f84-844a-9271184774dd unbound from our chassis#033[00m
Dec  1 05:21:58 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:21:58.597 141685 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4e3600e5-3d5c-4f84-844a-9271184774dd, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  1 05:21:58 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:21:58.597 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[f3eac2f7-abf7-451b-afa4-9bdff924ced4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:21:58 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:21:58.598 141685 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-4e3600e5-3d5c-4f84-844a-9271184774dd namespace which is not needed anymore#033[00m
Dec  1 05:21:58 np0005540826 nova_compute[229148]: 2025-12-01 10:21:58.604 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:21:58 np0005540826 nova_compute[229148]: 2025-12-01 10:21:58.606 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:21:58 np0005540826 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000006.scope: Deactivated successfully.
Dec  1 05:21:58 np0005540826 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000006.scope: Consumed 18.295s CPU time.
Dec  1 05:21:58 np0005540826 systemd-machined[192474]: Machine qemu-4-instance-00000006 terminated.
Dec  1 05:21:58 np0005540826 podman[239670]: 2025-12-01 10:21:58.677920887 +0000 UTC m=+0.061101213 container health_status 2f6a34a2eb1139886d5df897b4264e2aa17883bd121a8757ac91426f6d44356f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent)
Dec  1 05:21:58 np0005540826 neutron-haproxy-ovnmeta-4e3600e5-3d5c-4f84-844a-9271184774dd[238749]: [NOTICE]   (238753) : haproxy version is 2.8.14-c23fe91
Dec  1 05:21:58 np0005540826 neutron-haproxy-ovnmeta-4e3600e5-3d5c-4f84-844a-9271184774dd[238749]: [NOTICE]   (238753) : path to executable is /usr/sbin/haproxy
Dec  1 05:21:58 np0005540826 neutron-haproxy-ovnmeta-4e3600e5-3d5c-4f84-844a-9271184774dd[238749]: [WARNING]  (238753) : Exiting Master process...
Dec  1 05:21:58 np0005540826 neutron-haproxy-ovnmeta-4e3600e5-3d5c-4f84-844a-9271184774dd[238749]: [ALERT]    (238753) : Current worker (238755) exited with code 143 (Terminated)
Dec  1 05:21:58 np0005540826 neutron-haproxy-ovnmeta-4e3600e5-3d5c-4f84-844a-9271184774dd[238749]: [WARNING]  (238753) : All workers exited. Exiting... (0)
Dec  1 05:21:58 np0005540826 systemd[1]: libpod-f8747e53949ad8ccd95fc5a41e9368fcf6d3784e7d1e3c4849e5ec069359915b.scope: Deactivated successfully.
Dec  1 05:21:58 np0005540826 podman[239712]: 2025-12-01 10:21:58.732838336 +0000 UTC m=+0.041695931 container died f8747e53949ad8ccd95fc5a41e9368fcf6d3784e7d1e3c4849e5ec069359915b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4e3600e5-3d5c-4f84-844a-9271184774dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec  1 05:21:58 np0005540826 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f8747e53949ad8ccd95fc5a41e9368fcf6d3784e7d1e3c4849e5ec069359915b-userdata-shm.mount: Deactivated successfully.
Dec  1 05:21:58 np0005540826 systemd[1]: var-lib-containers-storage-overlay-7708c844f0fb33f55d88e6d9cfa28c7cc086bd7f5d6a6e81922c9692b15632f0-merged.mount: Deactivated successfully.
Dec  1 05:21:58 np0005540826 podman[239712]: 2025-12-01 10:21:58.772167694 +0000 UTC m=+0.081025289 container cleanup f8747e53949ad8ccd95fc5a41e9368fcf6d3784e7d1e3c4849e5ec069359915b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4e3600e5-3d5c-4f84-844a-9271184774dd, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec  1 05:21:58 np0005540826 systemd[1]: libpod-conmon-f8747e53949ad8ccd95fc5a41e9368fcf6d3784e7d1e3c4849e5ec069359915b.scope: Deactivated successfully.
Dec  1 05:21:58 np0005540826 nova_compute[229148]: 2025-12-01 10:21:58.829 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:21:58 np0005540826 podman[239760]: 2025-12-01 10:21:58.835132966 +0000 UTC m=+0.041176488 container remove f8747e53949ad8ccd95fc5a41e9368fcf6d3784e7d1e3c4849e5ec069359915b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4e3600e5-3d5c-4f84-844a-9271184774dd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, io.buildah.version=1.41.3)
Dec  1 05:21:58 np0005540826 nova_compute[229148]: 2025-12-01 10:21:58.837 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:21:58 np0005540826 nova_compute[229148]: 2025-12-01 10:21:58.846 229152 INFO nova.virt.libvirt.driver [-] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Instance destroyed successfully.#033[00m
Dec  1 05:21:58 np0005540826 nova_compute[229148]: 2025-12-01 10:21:58.847 229152 DEBUG nova.objects.instance [None req-8968846d-7dd8-4b72-b931-8ede11f3af85 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lazy-loading 'resources' on Instance uuid c46b48c6-f17b-46fd-909f-65cf07dab4e6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  1 05:21:58 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:21:58.847 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[eb906866-21e9-4d7c-94ce-a656b7a49786]: (4, ('Mon Dec  1 10:21:58 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-4e3600e5-3d5c-4f84-844a-9271184774dd (f8747e53949ad8ccd95fc5a41e9368fcf6d3784e7d1e3c4849e5ec069359915b)\nf8747e53949ad8ccd95fc5a41e9368fcf6d3784e7d1e3c4849e5ec069359915b\nMon Dec  1 10:21:58 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-4e3600e5-3d5c-4f84-844a-9271184774dd (f8747e53949ad8ccd95fc5a41e9368fcf6d3784e7d1e3c4849e5ec069359915b)\nf8747e53949ad8ccd95fc5a41e9368fcf6d3784e7d1e3c4849e5ec069359915b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:21:58 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:21:58.850 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[eacea285-cc70-4069-8b37-a07a597d93df]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:21:58 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:21:58.851 141685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4e3600e5-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  1 05:21:58 np0005540826 nova_compute[229148]: 2025-12-01 10:21:58.853 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:21:58 np0005540826 kernel: tap4e3600e5-30: left promiscuous mode
Dec  1 05:21:58 np0005540826 nova_compute[229148]: 2025-12-01 10:21:58.876 229152 DEBUG nova.virt.libvirt.vif [None req-8968846d-7dd8-4b72-b931-8ede11f3af85 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-01T10:19:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1386308587',display_name='tempest-TestNetworkBasicOps-server-1386308587',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1386308587',id=6,image_ref='8f75d6de-6ce0-44e1-b417-d0111424475b',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNlsicanYXytlXTgYTmaMFboTco/f673p0Przv+d4K28ZijgatIkOSXNpjiz5VeubRvfxH8C0Wp0yqZF71ydwtKBTEW7G7vvokxNdm8j7TByXnmiUBYXpHHmHVf8HImEjA==',key_name='tempest-TestNetworkBasicOps-553113242',keypairs=<?>,launch_index=0,launched_at=2025-12-01T10:20:03Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='9f6be4e572624210b91193c011607c08',ramdisk_id='',reservation_id='r-djnfhl6c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='8f75d6de-6ce0-44e1-b417-d0111424475b',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1248115384',owner_user_name='tempest-TestNetworkBasicOps-1248115384-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-01T10:20:04Z,user_data=None,user_id='5b56a238daf0445798410e51caada0ff',uuid=c46b48c6-f17b-46fd-909f-65cf07dab4e6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c9f60650-13a9-4fee-a03c-4878c9ea3a07", "address": "fa:16:3e:0c:ce:19", "network": {"id": "4e3600e5-3d5c-4f84-844a-9271184774dd", "bridge": "br-int", "label": "tempest-network-smoke--432110090", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.243", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9f60650-13", "ovs_interfaceid": "c9f60650-13a9-4fee-a03c-4878c9ea3a07", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  1 05:21:58 np0005540826 nova_compute[229148]: 2025-12-01 10:21:58.877 229152 DEBUG nova.network.os_vif_util [None req-8968846d-7dd8-4b72-b931-8ede11f3af85 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Converting VIF {"id": "c9f60650-13a9-4fee-a03c-4878c9ea3a07", "address": "fa:16:3e:0c:ce:19", "network": {"id": "4e3600e5-3d5c-4f84-844a-9271184774dd", "bridge": "br-int", "label": "tempest-network-smoke--432110090", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.243", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9f60650-13", "ovs_interfaceid": "c9f60650-13a9-4fee-a03c-4878c9ea3a07", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  1 05:21:58 np0005540826 nova_compute[229148]: 2025-12-01 10:21:58.878 229152 DEBUG nova.network.os_vif_util [None req-8968846d-7dd8-4b72-b931-8ede11f3af85 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:0c:ce:19,bridge_name='br-int',has_traffic_filtering=True,id=c9f60650-13a9-4fee-a03c-4878c9ea3a07,network=Network(4e3600e5-3d5c-4f84-844a-9271184774dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc9f60650-13') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  1 05:21:58 np0005540826 nova_compute[229148]: 2025-12-01 10:21:58.879 229152 DEBUG os_vif [None req-8968846d-7dd8-4b72-b931-8ede11f3af85 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:0c:ce:19,bridge_name='br-int',has_traffic_filtering=True,id=c9f60650-13a9-4fee-a03c-4878c9ea3a07,network=Network(4e3600e5-3d5c-4f84-844a-9271184774dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc9f60650-13') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  1 05:21:58 np0005540826 nova_compute[229148]: 2025-12-01 10:21:58.880 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:21:58 np0005540826 nova_compute[229148]: 2025-12-01 10:21:58.881 229152 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc9f60650-13, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  1 05:21:58 np0005540826 nova_compute[229148]: 2025-12-01 10:21:58.913 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:21:58 np0005540826 nova_compute[229148]: 2025-12-01 10:21:58.916 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  1 05:21:58 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:21:58.917 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[1ff09908-25a0-478b-8156-86192a6228c0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:21:58 np0005540826 nova_compute[229148]: 2025-12-01 10:21:58.924 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:21:58 np0005540826 nova_compute[229148]: 2025-12-01 10:21:58.927 229152 INFO os_vif [None req-8968846d-7dd8-4b72-b931-8ede11f3af85 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:0c:ce:19,bridge_name='br-int',has_traffic_filtering=True,id=c9f60650-13a9-4fee-a03c-4878c9ea3a07,network=Network(4e3600e5-3d5c-4f84-844a-9271184774dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc9f60650-13')#033[00m
Dec  1 05:21:58 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:21:58.929 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[913a6055-9100-4e6c-9065-619e748d5cbb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:21:58 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:21:58.930 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[cb5de756-fb4b-457b-8d90-d5b5e65d9cf0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:21:58 np0005540826 nova_compute[229148]: 2025-12-01 10:21:58.927 229152 DEBUG nova.virt.libvirt.vif [None req-8968846d-7dd8-4b72-b931-8ede11f3af85 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-01T10:19:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1386308587',display_name='tempest-TestNetworkBasicOps-server-1386308587',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1386308587',id=6,image_ref='8f75d6de-6ce0-44e1-b417-d0111424475b',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNlsicanYXytlXTgYTmaMFboTco/f673p0Przv+d4K28ZijgatIkOSXNpjiz5VeubRvfxH8C0Wp0yqZF71ydwtKBTEW7G7vvokxNdm8j7TByXnmiUBYXpHHmHVf8HImEjA==',key_name='tempest-TestNetworkBasicOps-553113242',keypairs=<?>,launch_index=0,launched_at=2025-12-01T10:20:03Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='9f6be4e572624210b91193c011607c08',ramdisk_id='',reservation_id='r-djnfhl6c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='8f75d6de-6ce0-44e1-b417-d0111424475b',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1248115384',owner_user_name='tempest-TestNetworkBasicOps-1248115384-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-01T10:20:04Z,user_data=None,user_id='5b56a238daf0445798410e51caada0ff',uuid=c46b48c6-f17b-46fd-909f-65cf07dab4e6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "61c72b1f-aea9-4ec0-80c5-03c2e87aad8f", "address": "fa:16:3e:71:2f:0e", "network": {"id": "0e5b3de9-56f5-4f4d-87c1-c01596567748", "bridge": "br-int", "label": "tempest-network-smoke--1903173849", "subnets": [], "meta": {"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": 
{"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61c72b1f-ae", "ovs_interfaceid": "61c72b1f-aea9-4ec0-80c5-03c2e87aad8f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  1 05:21:58 np0005540826 nova_compute[229148]: 2025-12-01 10:21:58.928 229152 DEBUG nova.network.os_vif_util [None req-8968846d-7dd8-4b72-b931-8ede11f3af85 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Converting VIF {"id": "61c72b1f-aea9-4ec0-80c5-03c2e87aad8f", "address": "fa:16:3e:71:2f:0e", "network": {"id": "0e5b3de9-56f5-4f4d-87c1-c01596567748", "bridge": "br-int", "label": "tempest-network-smoke--1903173849", "subnets": [], "meta": {"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61c72b1f-ae", "ovs_interfaceid": "61c72b1f-aea9-4ec0-80c5-03c2e87aad8f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  1 05:21:58 np0005540826 nova_compute[229148]: 2025-12-01 10:21:58.928 229152 DEBUG nova.network.os_vif_util [None req-8968846d-7dd8-4b72-b931-8ede11f3af85 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:71:2f:0e,bridge_name='br-int',has_traffic_filtering=True,id=61c72b1f-aea9-4ec0-80c5-03c2e87aad8f,network=Network(0e5b3de9-56f5-4f4d-87c1-c01596567748),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap61c72b1f-ae') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  1 05:21:58 np0005540826 nova_compute[229148]: 2025-12-01 10:21:58.928 229152 DEBUG os_vif [None req-8968846d-7dd8-4b72-b931-8ede11f3af85 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:71:2f:0e,bridge_name='br-int',has_traffic_filtering=True,id=61c72b1f-aea9-4ec0-80c5-03c2e87aad8f,network=Network(0e5b3de9-56f5-4f4d-87c1-c01596567748),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap61c72b1f-ae') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  1 05:21:58 np0005540826 nova_compute[229148]: 2025-12-01 10:21:58.931 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:21:58 np0005540826 nova_compute[229148]: 2025-12-01 10:21:58.932 229152 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap61c72b1f-ae, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  1 05:21:58 np0005540826 nova_compute[229148]: 2025-12-01 10:21:58.932 229152 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  1 05:21:58 np0005540826 nova_compute[229148]: 2025-12-01 10:21:58.933 229152 INFO os_vif [None req-8968846d-7dd8-4b72-b931-8ede11f3af85 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:71:2f:0e,bridge_name='br-int',has_traffic_filtering=True,id=61c72b1f-aea9-4ec0-80c5-03c2e87aad8f,network=Network(0e5b3de9-56f5-4f4d-87c1-c01596567748),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap61c72b1f-ae')#033[00m
Dec  1 05:21:58 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:21:58.948 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[3db1621e-f437-420a-aba1-8cfb558d13c2]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 426559, 'reachable_time': 31355, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 239785, 'error': None, 'target': 'ovnmeta-4e3600e5-3d5c-4f84-844a-9271184774dd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:21:58 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:21:58.951 141797 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-4e3600e5-3d5c-4f84-844a-9271184774dd deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec  1 05:21:58 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:21:58.951 141797 DEBUG oslo.privsep.daemon [-] privsep: reply[299fe175-00f7-4d13-ae26-8196a1895ba0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:21:58 np0005540826 systemd[1]: run-netns-ovnmeta\x2d4e3600e5\x2d3d5c\x2d4f84\x2d844a\x2d9271184774dd.mount: Deactivated successfully.
Dec  1 05:21:59 np0005540826 nova_compute[229148]: 2025-12-01 10:21:59.055 229152 INFO nova.network.neutron [None req-09bdf7db-8c31-40d7-bddf-4997f2924694 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Port 61c72b1f-aea9-4ec0-80c5-03c2e87aad8f from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.#033[00m
Dec  1 05:21:59 np0005540826 nova_compute[229148]: 2025-12-01 10:21:59.056 229152 DEBUG nova.network.neutron [None req-09bdf7db-8c31-40d7-bddf-4997f2924694 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Updating instance_info_cache with network_info: [{"id": "c9f60650-13a9-4fee-a03c-4878c9ea3a07", "address": "fa:16:3e:0c:ce:19", "network": {"id": "4e3600e5-3d5c-4f84-844a-9271184774dd", "bridge": "br-int", "label": "tempest-network-smoke--432110090", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9f60650-13", "ovs_interfaceid": "c9f60650-13a9-4fee-a03c-4878c9ea3a07", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  1 05:21:59 np0005540826 nova_compute[229148]: 2025-12-01 10:21:59.073 229152 DEBUG oslo_concurrency.lockutils [None req-09bdf7db-8c31-40d7-bddf-4997f2924694 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Releasing lock "refresh_cache-c46b48c6-f17b-46fd-909f-65cf07dab4e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  1 05:21:59 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:21:59 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1203186481' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:21:59 np0005540826 nova_compute[229148]: 2025-12-01 10:21:59.115 229152 DEBUG oslo_concurrency.lockutils [None req-09bdf7db-8c31-40d7-bddf-4997f2924694 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "interface-c46b48c6-f17b-46fd-909f-65cf07dab4e6-61c72b1f-aea9-4ec0-80c5-03c2e87aad8f" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 4.850s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:21:59 np0005540826 nova_compute[229148]: 2025-12-01 10:21:59.117 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.511s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:21:59 np0005540826 nova_compute[229148]: 2025-12-01 10:21:59.124 229152 DEBUG nova.compute.provider_tree [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Inventory has not changed in ProviderTree for provider: 19014d04-db84-4f3d-831b-084720e9168c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  1 05:21:59 np0005540826 nova_compute[229148]: 2025-12-01 10:21:59.148 229152 DEBUG nova.scheduler.client.report [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Inventory has not changed for provider 19014d04-db84-4f3d-831b-084720e9168c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  1 05:21:59 np0005540826 nova_compute[229148]: 2025-12-01 10:21:59.149 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  1 05:21:59 np0005540826 nova_compute[229148]: 2025-12-01 10:21:59.149 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.852s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:21:59 np0005540826 nova_compute[229148]: 2025-12-01 10:21:59.191 229152 DEBUG nova.compute.manager [req-3dfa5b87-cc0b-4b5d-a847-719be3580373 req-1a9800c8-2c46-4d91-820d-6eb8b133f059 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Received event network-changed-c9f60650-13a9-4fee-a03c-4878c9ea3a07 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  1 05:21:59 np0005540826 nova_compute[229148]: 2025-12-01 10:21:59.191 229152 DEBUG nova.compute.manager [req-3dfa5b87-cc0b-4b5d-a847-719be3580373 req-1a9800c8-2c46-4d91-820d-6eb8b133f059 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Refreshing instance network info cache due to event network-changed-c9f60650-13a9-4fee-a03c-4878c9ea3a07. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  1 05:21:59 np0005540826 nova_compute[229148]: 2025-12-01 10:21:59.192 229152 DEBUG oslo_concurrency.lockutils [req-3dfa5b87-cc0b-4b5d-a847-719be3580373 req-1a9800c8-2c46-4d91-820d-6eb8b133f059 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Acquiring lock "refresh_cache-c46b48c6-f17b-46fd-909f-65cf07dab4e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  1 05:21:59 np0005540826 nova_compute[229148]: 2025-12-01 10:21:59.192 229152 DEBUG oslo_concurrency.lockutils [req-3dfa5b87-cc0b-4b5d-a847-719be3580373 req-1a9800c8-2c46-4d91-820d-6eb8b133f059 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Acquired lock "refresh_cache-c46b48c6-f17b-46fd-909f-65cf07dab4e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  1 05:21:59 np0005540826 nova_compute[229148]: 2025-12-01 10:21:59.192 229152 DEBUG nova.network.neutron [req-3dfa5b87-cc0b-4b5d-a847-719be3580373 req-1a9800c8-2c46-4d91-820d-6eb8b133f059 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Refreshing network info cache for port c9f60650-13a9-4fee-a03c-4878c9ea3a07 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  1 05:21:59 np0005540826 nova_compute[229148]: 2025-12-01 10:21:59.217 229152 DEBUG nova.compute.manager [req-c610e4a4-86e4-47c1-bb0a-f4902ac2c6c7 req-c004f451-90ee-4f51-9f5d-8c7250b801dd dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Received event network-vif-plugged-61c72b1f-aea9-4ec0-80c5-03c2e87aad8f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  1 05:21:59 np0005540826 nova_compute[229148]: 2025-12-01 10:21:59.217 229152 DEBUG oslo_concurrency.lockutils [req-c610e4a4-86e4-47c1-bb0a-f4902ac2c6c7 req-c004f451-90ee-4f51-9f5d-8c7250b801dd dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Acquiring lock "c46b48c6-f17b-46fd-909f-65cf07dab4e6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:21:59 np0005540826 nova_compute[229148]: 2025-12-01 10:21:59.218 229152 DEBUG oslo_concurrency.lockutils [req-c610e4a4-86e4-47c1-bb0a-f4902ac2c6c7 req-c004f451-90ee-4f51-9f5d-8c7250b801dd dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Lock "c46b48c6-f17b-46fd-909f-65cf07dab4e6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:21:59 np0005540826 nova_compute[229148]: 2025-12-01 10:21:59.218 229152 DEBUG oslo_concurrency.lockutils [req-c610e4a4-86e4-47c1-bb0a-f4902ac2c6c7 req-c004f451-90ee-4f51-9f5d-8c7250b801dd dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Lock "c46b48c6-f17b-46fd-909f-65cf07dab4e6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:21:59 np0005540826 nova_compute[229148]: 2025-12-01 10:21:59.218 229152 DEBUG nova.compute.manager [req-c610e4a4-86e4-47c1-bb0a-f4902ac2c6c7 req-c004f451-90ee-4f51-9f5d-8c7250b801dd dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] No waiting events found dispatching network-vif-plugged-61c72b1f-aea9-4ec0-80c5-03c2e87aad8f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  1 05:21:59 np0005540826 nova_compute[229148]: 2025-12-01 10:21:59.218 229152 WARNING nova.compute.manager [req-c610e4a4-86e4-47c1-bb0a-f4902ac2c6c7 req-c004f451-90ee-4f51-9f5d-8c7250b801dd dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Received unexpected event network-vif-plugged-61c72b1f-aea9-4ec0-80c5-03c2e87aad8f for instance with vm_state active and task_state deleting.#033[00m
Dec  1 05:21:59 np0005540826 nova_compute[229148]: 2025-12-01 10:21:59.219 229152 DEBUG nova.compute.manager [req-c610e4a4-86e4-47c1-bb0a-f4902ac2c6c7 req-c004f451-90ee-4f51-9f5d-8c7250b801dd dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Received event network-vif-unplugged-c9f60650-13a9-4fee-a03c-4878c9ea3a07 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  1 05:21:59 np0005540826 nova_compute[229148]: 2025-12-01 10:21:59.219 229152 DEBUG oslo_concurrency.lockutils [req-c610e4a4-86e4-47c1-bb0a-f4902ac2c6c7 req-c004f451-90ee-4f51-9f5d-8c7250b801dd dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Acquiring lock "c46b48c6-f17b-46fd-909f-65cf07dab4e6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:21:59 np0005540826 nova_compute[229148]: 2025-12-01 10:21:59.219 229152 DEBUG oslo_concurrency.lockutils [req-c610e4a4-86e4-47c1-bb0a-f4902ac2c6c7 req-c004f451-90ee-4f51-9f5d-8c7250b801dd dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Lock "c46b48c6-f17b-46fd-909f-65cf07dab4e6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:21:59 np0005540826 nova_compute[229148]: 2025-12-01 10:21:59.219 229152 DEBUG oslo_concurrency.lockutils [req-c610e4a4-86e4-47c1-bb0a-f4902ac2c6c7 req-c004f451-90ee-4f51-9f5d-8c7250b801dd dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Lock "c46b48c6-f17b-46fd-909f-65cf07dab4e6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:21:59 np0005540826 nova_compute[229148]: 2025-12-01 10:21:59.219 229152 DEBUG nova.compute.manager [req-c610e4a4-86e4-47c1-bb0a-f4902ac2c6c7 req-c004f451-90ee-4f51-9f5d-8c7250b801dd dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] No waiting events found dispatching network-vif-unplugged-c9f60650-13a9-4fee-a03c-4878c9ea3a07 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  1 05:21:59 np0005540826 nova_compute[229148]: 2025-12-01 10:21:59.219 229152 DEBUG nova.compute.manager [req-c610e4a4-86e4-47c1-bb0a-f4902ac2c6c7 req-c004f451-90ee-4f51-9f5d-8c7250b801dd dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Received event network-vif-unplugged-c9f60650-13a9-4fee-a03c-4878c9ea3a07 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec  1 05:21:59 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:21:59 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:21:59 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:21:59.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:21:59 np0005540826 nova_compute[229148]: 2025-12-01 10:21:59.715 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:22:00 np0005540826 nova_compute[229148]: 2025-12-01 10:22:00.334 229152 DEBUG nova.network.neutron [req-3dfa5b87-cc0b-4b5d-a847-719be3580373 req-1a9800c8-2c46-4d91-820d-6eb8b133f059 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Updated VIF entry in instance network info cache for port c9f60650-13a9-4fee-a03c-4878c9ea3a07. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  1 05:22:00 np0005540826 nova_compute[229148]: 2025-12-01 10:22:00.334 229152 DEBUG nova.network.neutron [req-3dfa5b87-cc0b-4b5d-a847-719be3580373 req-1a9800c8-2c46-4d91-820d-6eb8b133f059 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Updating instance_info_cache with network_info: [{"id": "c9f60650-13a9-4fee-a03c-4878c9ea3a07", "address": "fa:16:3e:0c:ce:19", "network": {"id": "4e3600e5-3d5c-4f84-844a-9271184774dd", "bridge": "br-int", "label": "tempest-network-smoke--432110090", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9f60650-13", "ovs_interfaceid": "c9f60650-13a9-4fee-a03c-4878c9ea3a07", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  1 05:22:00 np0005540826 nova_compute[229148]: 2025-12-01 10:22:00.357 229152 DEBUG oslo_concurrency.lockutils [req-3dfa5b87-cc0b-4b5d-a847-719be3580373 req-1a9800c8-2c46-4d91-820d-6eb8b133f059 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Releasing lock "refresh_cache-c46b48c6-f17b-46fd-909f-65cf07dab4e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  1 05:22:00 np0005540826 nova_compute[229148]: 2025-12-01 10:22:00.414 229152 INFO nova.virt.libvirt.driver [None req-8968846d-7dd8-4b72-b931-8ede11f3af85 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Deleting instance files /var/lib/nova/instances/c46b48c6-f17b-46fd-909f-65cf07dab4e6_del#033[00m
Dec  1 05:22:00 np0005540826 nova_compute[229148]: 2025-12-01 10:22:00.415 229152 INFO nova.virt.libvirt.driver [None req-8968846d-7dd8-4b72-b931-8ede11f3af85 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Deletion of /var/lib/nova/instances/c46b48c6-f17b-46fd-909f-65cf07dab4e6_del complete#033[00m
Dec  1 05:22:00 np0005540826 nova_compute[229148]: 2025-12-01 10:22:00.478 229152 INFO nova.compute.manager [None req-8968846d-7dd8-4b72-b931-8ede11f3af85 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Took 2.07 seconds to destroy the instance on the hypervisor.#033[00m
Dec  1 05:22:00 np0005540826 nova_compute[229148]: 2025-12-01 10:22:00.479 229152 DEBUG oslo.service.loopingcall [None req-8968846d-7dd8-4b72-b931-8ede11f3af85 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  1 05:22:00 np0005540826 nova_compute[229148]: 2025-12-01 10:22:00.479 229152 DEBUG nova.compute.manager [-] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  1 05:22:00 np0005540826 nova_compute[229148]: 2025-12-01 10:22:00.479 229152 DEBUG nova.network.neutron [-] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  1 05:22:00 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:22:00 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:22:00 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:22:00.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:22:00 np0005540826 nova_compute[229148]: 2025-12-01 10:22:00.774 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:22:00 np0005540826 nova_compute[229148]: 2025-12-01 10:22:00.775 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:22:01 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:22:01 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:22:01 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:22:01.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:22:01 np0005540826 nova_compute[229148]: 2025-12-01 10:22:01.286 229152 DEBUG nova.compute.manager [req-396c81b0-2096-42ba-971e-b5e5ff538c1e req-ee679164-9a1b-45bd-b211-9ed03dcfcb6f dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Received event network-vif-plugged-c9f60650-13a9-4fee-a03c-4878c9ea3a07 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  1 05:22:01 np0005540826 nova_compute[229148]: 2025-12-01 10:22:01.287 229152 DEBUG oslo_concurrency.lockutils [req-396c81b0-2096-42ba-971e-b5e5ff538c1e req-ee679164-9a1b-45bd-b211-9ed03dcfcb6f dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Acquiring lock "c46b48c6-f17b-46fd-909f-65cf07dab4e6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:22:01 np0005540826 nova_compute[229148]: 2025-12-01 10:22:01.287 229152 DEBUG oslo_concurrency.lockutils [req-396c81b0-2096-42ba-971e-b5e5ff538c1e req-ee679164-9a1b-45bd-b211-9ed03dcfcb6f dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Lock "c46b48c6-f17b-46fd-909f-65cf07dab4e6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:22:01 np0005540826 nova_compute[229148]: 2025-12-01 10:22:01.288 229152 DEBUG oslo_concurrency.lockutils [req-396c81b0-2096-42ba-971e-b5e5ff538c1e req-ee679164-9a1b-45bd-b211-9ed03dcfcb6f dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Lock "c46b48c6-f17b-46fd-909f-65cf07dab4e6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:22:01 np0005540826 nova_compute[229148]: 2025-12-01 10:22:01.288 229152 DEBUG nova.compute.manager [req-396c81b0-2096-42ba-971e-b5e5ff538c1e req-ee679164-9a1b-45bd-b211-9ed03dcfcb6f dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] No waiting events found dispatching network-vif-plugged-c9f60650-13a9-4fee-a03c-4878c9ea3a07 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  1 05:22:01 np0005540826 nova_compute[229148]: 2025-12-01 10:22:01.288 229152 WARNING nova.compute.manager [req-396c81b0-2096-42ba-971e-b5e5ff538c1e req-ee679164-9a1b-45bd-b211-9ed03dcfcb6f dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Received unexpected event network-vif-plugged-c9f60650-13a9-4fee-a03c-4878c9ea3a07 for instance with vm_state active and task_state deleting.#033[00m
Dec  1 05:22:01 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:22:02 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:22:02 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 05:22:02 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:22:02.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 05:22:02 np0005540826 systemd[1]: virtsecretd.service: Deactivated successfully.
Dec  1 05:22:03 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:22:03 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 05:22:03 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:22:03.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 05:22:03 np0005540826 nova_compute[229148]: 2025-12-01 10:22:03.913 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:22:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:22:04.553 141685 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:22:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:22:04.553 141685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:22:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:22:04.554 141685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:22:04 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:22:04 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:22:04 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:22:04.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:22:04 np0005540826 nova_compute[229148]: 2025-12-01 10:22:04.629 229152 DEBUG nova.network.neutron [-] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  1 05:22:04 np0005540826 nova_compute[229148]: 2025-12-01 10:22:04.655 229152 INFO nova.compute.manager [-] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Took 4.18 seconds to deallocate network for instance.#033[00m
Dec  1 05:22:04 np0005540826 nova_compute[229148]: 2025-12-01 10:22:04.698 229152 DEBUG oslo_concurrency.lockutils [None req-8968846d-7dd8-4b72-b931-8ede11f3af85 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:22:04 np0005540826 nova_compute[229148]: 2025-12-01 10:22:04.699 229152 DEBUG oslo_concurrency.lockutils [None req-8968846d-7dd8-4b72-b931-8ede11f3af85 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:22:04 np0005540826 nova_compute[229148]: 2025-12-01 10:22:04.718 229152 DEBUG nova.compute.manager [req-070013c8-ce82-46af-b284-903a9922fef5 req-4aff0fef-09c6-49c8-a058-1102bf742abc dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Received event network-vif-deleted-c9f60650-13a9-4fee-a03c-4878c9ea3a07 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  1 05:22:04 np0005540826 nova_compute[229148]: 2025-12-01 10:22:04.719 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:22:04 np0005540826 nova_compute[229148]: 2025-12-01 10:22:04.749 229152 DEBUG oslo_concurrency.processutils [None req-8968846d-7dd8-4b72-b931-8ede11f3af85 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:22:05 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:22:05 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/421893025' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:22:05 np0005540826 nova_compute[229148]: 2025-12-01 10:22:05.220 229152 DEBUG oslo_concurrency.processutils [None req-8968846d-7dd8-4b72-b931-8ede11f3af85 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.471s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:22:05 np0005540826 nova_compute[229148]: 2025-12-01 10:22:05.226 229152 DEBUG nova.compute.provider_tree [None req-8968846d-7dd8-4b72-b931-8ede11f3af85 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Inventory has not changed in ProviderTree for provider: 19014d04-db84-4f3d-831b-084720e9168c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  1 05:22:05 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:22:05 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:22:05 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:22:05.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:22:05 np0005540826 nova_compute[229148]: 2025-12-01 10:22:05.240 229152 DEBUG nova.scheduler.client.report [None req-8968846d-7dd8-4b72-b931-8ede11f3af85 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Inventory has not changed for provider 19014d04-db84-4f3d-831b-084720e9168c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  1 05:22:05 np0005540826 nova_compute[229148]: 2025-12-01 10:22:05.262 229152 DEBUG oslo_concurrency.lockutils [None req-8968846d-7dd8-4b72-b931-8ede11f3af85 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.563s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:22:05 np0005540826 nova_compute[229148]: 2025-12-01 10:22:05.294 229152 INFO nova.scheduler.client.report [None req-8968846d-7dd8-4b72-b931-8ede11f3af85 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Deleted allocations for instance c46b48c6-f17b-46fd-909f-65cf07dab4e6#033[00m
Dec  1 05:22:05 np0005540826 nova_compute[229148]: 2025-12-01 10:22:05.345 229152 DEBUG oslo_concurrency.lockutils [None req-8968846d-7dd8-4b72-b931-8ede11f3af85 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "c46b48c6-f17b-46fd-909f-65cf07dab4e6" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.940s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:22:06 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:22:06 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:22:06 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:22:06.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:22:06 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:22:07 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:22:07 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:22:07 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:22:07.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:22:08 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:22:08 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 05:22:08 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:22:08.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 05:22:08 np0005540826 nova_compute[229148]: 2025-12-01 10:22:08.945 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:22:09 np0005540826 nova_compute[229148]: 2025-12-01 10:22:09.220 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:22:09 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:22:09 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:22:09 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:22:09.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:22:09 np0005540826 nova_compute[229148]: 2025-12-01 10:22:09.291 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:22:09 np0005540826 nova_compute[229148]: 2025-12-01 10:22:09.719 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:22:10 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:22:10 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:22:10 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:22:10.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:22:11 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:22:11 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:22:11 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:22:11.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:22:11 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:22:12 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:22:12 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:22:12 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:22:12.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:22:13 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:22:13 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:22:13 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:22:13.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:22:13 np0005540826 nova_compute[229148]: 2025-12-01 10:22:13.844 229152 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764584518.8421378, c46b48c6-f17b-46fd-909f-65cf07dab4e6 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  1 05:22:13 np0005540826 nova_compute[229148]: 2025-12-01 10:22:13.844 229152 INFO nova.compute.manager [-] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] VM Stopped (Lifecycle Event)#033[00m
Dec  1 05:22:13 np0005540826 nova_compute[229148]: 2025-12-01 10:22:13.884 229152 DEBUG nova.compute.manager [None req-a2e745ca-e0b7-4495-8ae1-9c55a9b3c7c6 - - - - - -] [instance: c46b48c6-f17b-46fd-909f-65cf07dab4e6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  1 05:22:13 np0005540826 nova_compute[229148]: 2025-12-01 10:22:13.947 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:22:14 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:22:14 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:22:14 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:22:14.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:22:14 np0005540826 nova_compute[229148]: 2025-12-01 10:22:14.722 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:22:15 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:22:15 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:22:15 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:22:15.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:22:15 np0005540826 podman[239866]: 2025-12-01 10:22:15.99653408 +0000 UTC m=+0.064917833 container health_status 9ce59c2758d021c154e97f8a3178b73d8f43045c59b9ba6fd93369a760a49136 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=multipathd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 05:22:16 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:22:16 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:22:16 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:22:16.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:22:16 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:22:17 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:22:17 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:22:17 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:22:17.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:22:18 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:22:18 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:22:18 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:22:18.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:22:18 np0005540826 nova_compute[229148]: 2025-12-01 10:22:18.949 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:22:19 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:22:19 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:22:19 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:22:19.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:22:19 np0005540826 nova_compute[229148]: 2025-12-01 10:22:19.723 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:22:20 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:22:20 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.003000079s ======
Dec  1 05:22:20 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:22:20.592 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000079s
Dec  1 05:22:21 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:22:21 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:22:21 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:22:21.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:22:21 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:22:22 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:22:22 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:22:22 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:22:22.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:22:23 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:22:23 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:22:23 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:22:23.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:22:23 np0005540826 nova_compute[229148]: 2025-12-01 10:22:23.951 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:22:24 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:22:24 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:22:24 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:22:24.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:22:24 np0005540826 nova_compute[229148]: 2025-12-01 10:22:24.726 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:22:24 np0005540826 podman[239892]: 2025-12-01 10:22:24.993796081 +0000 UTC m=+0.080411333 container health_status 6b06bbb7dde72e1297fe084ecc5611549b12415c8a2a7d771b9728c6924dbf55 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Dec  1 05:22:25 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:22:25 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 05:22:25 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:22:25.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 05:22:26 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:22:26 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:22:26 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:22:26.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:22:26 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:22:26 np0005540826 nova_compute[229148]: 2025-12-01 10:22:26.986 229152 DEBUG oslo_concurrency.lockutils [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Acquiring lock "60827f60-9be9-4785-89b4-96d2a13acd60" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:22:26 np0005540826 nova_compute[229148]: 2025-12-01 10:22:26.986 229152 DEBUG oslo_concurrency.lockutils [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "60827f60-9be9-4785-89b4-96d2a13acd60" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:22:27 np0005540826 nova_compute[229148]: 2025-12-01 10:22:27.013 229152 DEBUG nova.compute.manager [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 60827f60-9be9-4785-89b4-96d2a13acd60] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  1 05:22:27 np0005540826 nova_compute[229148]: 2025-12-01 10:22:27.095 229152 DEBUG oslo_concurrency.lockutils [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:22:27 np0005540826 nova_compute[229148]: 2025-12-01 10:22:27.096 229152 DEBUG oslo_concurrency.lockutils [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:22:27 np0005540826 nova_compute[229148]: 2025-12-01 10:22:27.102 229152 DEBUG nova.virt.hardware [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  1 05:22:27 np0005540826 nova_compute[229148]: 2025-12-01 10:22:27.102 229152 INFO nova.compute.claims [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 60827f60-9be9-4785-89b4-96d2a13acd60] Claim successful on node compute-1.ctlplane.example.com#033[00m
Dec  1 05:22:27 np0005540826 nova_compute[229148]: 2025-12-01 10:22:27.211 229152 DEBUG oslo_concurrency.processutils [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:22:27 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:22:27 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:22:27 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:22:27.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:22:27 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:22:27 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/238511788' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:22:27 np0005540826 nova_compute[229148]: 2025-12-01 10:22:27.685 229152 DEBUG oslo_concurrency.processutils [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.475s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:22:27 np0005540826 nova_compute[229148]: 2025-12-01 10:22:27.694 229152 DEBUG nova.compute.provider_tree [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Inventory has not changed in ProviderTree for provider: 19014d04-db84-4f3d-831b-084720e9168c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  1 05:22:27 np0005540826 nova_compute[229148]: 2025-12-01 10:22:27.711 229152 DEBUG nova.scheduler.client.report [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Inventory has not changed for provider 19014d04-db84-4f3d-831b-084720e9168c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  1 05:22:27 np0005540826 nova_compute[229148]: 2025-12-01 10:22:27.738 229152 DEBUG oslo_concurrency.lockutils [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.642s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:22:27 np0005540826 nova_compute[229148]: 2025-12-01 10:22:27.739 229152 DEBUG nova.compute.manager [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 60827f60-9be9-4785-89b4-96d2a13acd60] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  1 05:22:27 np0005540826 nova_compute[229148]: 2025-12-01 10:22:27.796 229152 DEBUG nova.compute.manager [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 60827f60-9be9-4785-89b4-96d2a13acd60] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  1 05:22:27 np0005540826 nova_compute[229148]: 2025-12-01 10:22:27.797 229152 DEBUG nova.network.neutron [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 60827f60-9be9-4785-89b4-96d2a13acd60] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  1 05:22:27 np0005540826 nova_compute[229148]: 2025-12-01 10:22:27.818 229152 INFO nova.virt.libvirt.driver [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 60827f60-9be9-4785-89b4-96d2a13acd60] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  1 05:22:27 np0005540826 nova_compute[229148]: 2025-12-01 10:22:27.835 229152 DEBUG nova.compute.manager [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 60827f60-9be9-4785-89b4-96d2a13acd60] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  1 05:22:27 np0005540826 nova_compute[229148]: 2025-12-01 10:22:27.926 229152 DEBUG nova.compute.manager [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 60827f60-9be9-4785-89b4-96d2a13acd60] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  1 05:22:27 np0005540826 nova_compute[229148]: 2025-12-01 10:22:27.928 229152 DEBUG nova.virt.libvirt.driver [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 60827f60-9be9-4785-89b4-96d2a13acd60] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  1 05:22:27 np0005540826 nova_compute[229148]: 2025-12-01 10:22:27.929 229152 INFO nova.virt.libvirt.driver [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 60827f60-9be9-4785-89b4-96d2a13acd60] Creating image(s)#033[00m
Dec  1 05:22:27 np0005540826 nova_compute[229148]: 2025-12-01 10:22:27.958 229152 DEBUG nova.storage.rbd_utils [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] rbd image 60827f60-9be9-4785-89b4-96d2a13acd60_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Dec  1 05:22:27 np0005540826 nova_compute[229148]: 2025-12-01 10:22:27.985 229152 DEBUG nova.storage.rbd_utils [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] rbd image 60827f60-9be9-4785-89b4-96d2a13acd60_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Dec  1 05:22:28 np0005540826 nova_compute[229148]: 2025-12-01 10:22:28.011 229152 DEBUG nova.storage.rbd_utils [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] rbd image 60827f60-9be9-4785-89b4-96d2a13acd60_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Dec  1 05:22:28 np0005540826 nova_compute[229148]: 2025-12-01 10:22:28.015 229152 DEBUG oslo_concurrency.processutils [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/caad95fa2cc8ed03bed2e9851744954b07ec7b34 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:22:28 np0005540826 nova_compute[229148]: 2025-12-01 10:22:28.093 229152 DEBUG oslo_concurrency.processutils [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/caad95fa2cc8ed03bed2e9851744954b07ec7b34 --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:22:28 np0005540826 nova_compute[229148]: 2025-12-01 10:22:28.094 229152 DEBUG oslo_concurrency.lockutils [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Acquiring lock "caad95fa2cc8ed03bed2e9851744954b07ec7b34" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:22:28 np0005540826 nova_compute[229148]: 2025-12-01 10:22:28.095 229152 DEBUG oslo_concurrency.lockutils [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "caad95fa2cc8ed03bed2e9851744954b07ec7b34" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:22:28 np0005540826 nova_compute[229148]: 2025-12-01 10:22:28.095 229152 DEBUG oslo_concurrency.lockutils [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "caad95fa2cc8ed03bed2e9851744954b07ec7b34" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:22:28 np0005540826 nova_compute[229148]: 2025-12-01 10:22:28.121 229152 DEBUG nova.storage.rbd_utils [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] rbd image 60827f60-9be9-4785-89b4-96d2a13acd60_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Dec  1 05:22:28 np0005540826 nova_compute[229148]: 2025-12-01 10:22:28.125 229152 DEBUG oslo_concurrency.processutils [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/caad95fa2cc8ed03bed2e9851744954b07ec7b34 60827f60-9be9-4785-89b4-96d2a13acd60_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:22:28 np0005540826 nova_compute[229148]: 2025-12-01 10:22:28.221 229152 DEBUG nova.policy [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5b56a238daf0445798410e51caada0ff', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9f6be4e572624210b91193c011607c08', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  1 05:22:28 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:22:28 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:22:28 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:22:28.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:22:28 np0005540826 nova_compute[229148]: 2025-12-01 10:22:28.610 229152 DEBUG oslo_concurrency.processutils [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/caad95fa2cc8ed03bed2e9851744954b07ec7b34 60827f60-9be9-4785-89b4-96d2a13acd60_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.485s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:22:28 np0005540826 nova_compute[229148]: 2025-12-01 10:22:28.689 229152 DEBUG nova.storage.rbd_utils [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] resizing rbd image 60827f60-9be9-4785-89b4-96d2a13acd60_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Dec  1 05:22:29 np0005540826 nova_compute[229148]: 2025-12-01 10:22:29.017 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:22:29 np0005540826 nova_compute[229148]: 2025-12-01 10:22:29.070 229152 DEBUG nova.objects.instance [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lazy-loading 'migration_context' on Instance uuid 60827f60-9be9-4785-89b4-96d2a13acd60 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  1 05:22:29 np0005540826 nova_compute[229148]: 2025-12-01 10:22:29.092 229152 DEBUG nova.virt.libvirt.driver [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 60827f60-9be9-4785-89b4-96d2a13acd60] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  1 05:22:29 np0005540826 nova_compute[229148]: 2025-12-01 10:22:29.092 229152 DEBUG nova.virt.libvirt.driver [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 60827f60-9be9-4785-89b4-96d2a13acd60] Ensure instance console log exists: /var/lib/nova/instances/60827f60-9be9-4785-89b4-96d2a13acd60/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  1 05:22:29 np0005540826 nova_compute[229148]: 2025-12-01 10:22:29.093 229152 DEBUG oslo_concurrency.lockutils [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:22:29 np0005540826 nova_compute[229148]: 2025-12-01 10:22:29.093 229152 DEBUG oslo_concurrency.lockutils [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:22:29 np0005540826 nova_compute[229148]: 2025-12-01 10:22:29.093 229152 DEBUG oslo_concurrency.lockutils [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:22:29 np0005540826 podman[240116]: 2025-12-01 10:22:29.101920642 +0000 UTC m=+0.047956707 container health_status 2f6a34a2eb1139886d5df897b4264e2aa17883bd121a8757ac91426f6d44356f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, 
maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  1 05:22:29 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:22:29 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 05:22:29 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:22:29.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 05:22:29 np0005540826 nova_compute[229148]: 2025-12-01 10:22:29.435 229152 DEBUG nova.network.neutron [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 60827f60-9be9-4785-89b4-96d2a13acd60] Successfully updated port: a18935ec-0bdc-41b0-9e52-6e3919b1ede3 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  1 05:22:29 np0005540826 nova_compute[229148]: 2025-12-01 10:22:29.453 229152 DEBUG oslo_concurrency.lockutils [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Acquiring lock "refresh_cache-60827f60-9be9-4785-89b4-96d2a13acd60" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  1 05:22:29 np0005540826 nova_compute[229148]: 2025-12-01 10:22:29.453 229152 DEBUG oslo_concurrency.lockutils [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Acquired lock "refresh_cache-60827f60-9be9-4785-89b4-96d2a13acd60" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  1 05:22:29 np0005540826 nova_compute[229148]: 2025-12-01 10:22:29.454 229152 DEBUG nova.network.neutron [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 60827f60-9be9-4785-89b4-96d2a13acd60] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  1 05:22:29 np0005540826 nova_compute[229148]: 2025-12-01 10:22:29.561 229152 DEBUG nova.compute.manager [req-1c107054-a618-4009-aafa-0b700e39217b req-d053ac61-e090-46ad-9e38-2c16646683a1 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 60827f60-9be9-4785-89b4-96d2a13acd60] Received event network-changed-a18935ec-0bdc-41b0-9e52-6e3919b1ede3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  1 05:22:29 np0005540826 nova_compute[229148]: 2025-12-01 10:22:29.561 229152 DEBUG nova.compute.manager [req-1c107054-a618-4009-aafa-0b700e39217b req-d053ac61-e090-46ad-9e38-2c16646683a1 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 60827f60-9be9-4785-89b4-96d2a13acd60] Refreshing instance network info cache due to event network-changed-a18935ec-0bdc-41b0-9e52-6e3919b1ede3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  1 05:22:29 np0005540826 nova_compute[229148]: 2025-12-01 10:22:29.562 229152 DEBUG oslo_concurrency.lockutils [req-1c107054-a618-4009-aafa-0b700e39217b req-d053ac61-e090-46ad-9e38-2c16646683a1 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Acquiring lock "refresh_cache-60827f60-9be9-4785-89b4-96d2a13acd60" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  1 05:22:29 np0005540826 nova_compute[229148]: 2025-12-01 10:22:29.727 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:22:29 np0005540826 nova_compute[229148]: 2025-12-01 10:22:29.813 229152 DEBUG nova.network.neutron [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 60827f60-9be9-4785-89b4-96d2a13acd60] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  1 05:22:30 np0005540826 nova_compute[229148]: 2025-12-01 10:22:30.486 229152 DEBUG nova.network.neutron [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 60827f60-9be9-4785-89b4-96d2a13acd60] Updating instance_info_cache with network_info: [{"id": "a18935ec-0bdc-41b0-9e52-6e3919b1ede3", "address": "fa:16:3e:4d:c6:8f", "network": {"id": "434ae97b-0a30-409f-b9ad-87922177cfc0", "bridge": "br-int", "label": "tempest-network-smoke--141806258", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa18935ec-0b", "ovs_interfaceid": "a18935ec-0bdc-41b0-9e52-6e3919b1ede3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  1 05:22:30 np0005540826 nova_compute[229148]: 2025-12-01 10:22:30.503 229152 DEBUG oslo_concurrency.lockutils [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Releasing lock "refresh_cache-60827f60-9be9-4785-89b4-96d2a13acd60" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  1 05:22:30 np0005540826 nova_compute[229148]: 2025-12-01 10:22:30.504 229152 DEBUG nova.compute.manager [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 60827f60-9be9-4785-89b4-96d2a13acd60] Instance network_info: |[{"id": "a18935ec-0bdc-41b0-9e52-6e3919b1ede3", "address": "fa:16:3e:4d:c6:8f", "network": {"id": "434ae97b-0a30-409f-b9ad-87922177cfc0", "bridge": "br-int", "label": "tempest-network-smoke--141806258", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa18935ec-0b", "ovs_interfaceid": "a18935ec-0bdc-41b0-9e52-6e3919b1ede3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  1 05:22:30 np0005540826 nova_compute[229148]: 2025-12-01 10:22:30.504 229152 DEBUG oslo_concurrency.lockutils [req-1c107054-a618-4009-aafa-0b700e39217b req-d053ac61-e090-46ad-9e38-2c16646683a1 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Acquired lock "refresh_cache-60827f60-9be9-4785-89b4-96d2a13acd60" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  1 05:22:30 np0005540826 nova_compute[229148]: 2025-12-01 10:22:30.505 229152 DEBUG nova.network.neutron [req-1c107054-a618-4009-aafa-0b700e39217b req-d053ac61-e090-46ad-9e38-2c16646683a1 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 60827f60-9be9-4785-89b4-96d2a13acd60] Refreshing network info cache for port a18935ec-0bdc-41b0-9e52-6e3919b1ede3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  1 05:22:30 np0005540826 nova_compute[229148]: 2025-12-01 10:22:30.508 229152 DEBUG nova.virt.libvirt.driver [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 60827f60-9be9-4785-89b4-96d2a13acd60] Start _get_guest_xml network_info=[{"id": "a18935ec-0bdc-41b0-9e52-6e3919b1ede3", "address": "fa:16:3e:4d:c6:8f", "network": {"id": "434ae97b-0a30-409f-b9ad-87922177cfc0", "bridge": "br-int", "label": "tempest-network-smoke--141806258", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa18935ec-0b", "ovs_interfaceid": "a18935ec-0bdc-41b0-9e52-6e3919b1ede3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-01T10:14:19Z,direct_url=<?>,disk_format='qcow2',id=8f75d6de-6ce0-44e1-b417-d0111424475b,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9a5734898a6345909986f17ddf57b27d',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-01T10:14:22Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'encryption_options': None, 'disk_bus': 'virtio', 'encryption_format': None, 'size': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'device_type': 'disk', 'image_id': '8f75d6de-6ce0-44e1-b417-d0111424475b'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  1 05:22:30 np0005540826 nova_compute[229148]: 2025-12-01 10:22:30.512 229152 WARNING nova.virt.libvirt.driver [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  1 05:22:30 np0005540826 nova_compute[229148]: 2025-12-01 10:22:30.517 229152 DEBUG nova.virt.libvirt.host [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  1 05:22:30 np0005540826 nova_compute[229148]: 2025-12-01 10:22:30.517 229152 DEBUG nova.virt.libvirt.host [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  1 05:22:30 np0005540826 nova_compute[229148]: 2025-12-01 10:22:30.525 229152 DEBUG nova.virt.libvirt.host [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  1 05:22:30 np0005540826 nova_compute[229148]: 2025-12-01 10:22:30.526 229152 DEBUG nova.virt.libvirt.host [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  1 05:22:30 np0005540826 nova_compute[229148]: 2025-12-01 10:22:30.527 229152 DEBUG nova.virt.libvirt.driver [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  1 05:22:30 np0005540826 nova_compute[229148]: 2025-12-01 10:22:30.527 229152 DEBUG nova.virt.hardware [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-01T10:14:17Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='2e731827-1896-49cd-b0cc-12903555d217',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-01T10:14:19Z,direct_url=<?>,disk_format='qcow2',id=8f75d6de-6ce0-44e1-b417-d0111424475b,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9a5734898a6345909986f17ddf57b27d',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-01T10:14:22Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  1 05:22:30 np0005540826 nova_compute[229148]: 2025-12-01 10:22:30.528 229152 DEBUG nova.virt.hardware [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  1 05:22:30 np0005540826 nova_compute[229148]: 2025-12-01 10:22:30.529 229152 DEBUG nova.virt.hardware [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  1 05:22:30 np0005540826 nova_compute[229148]: 2025-12-01 10:22:30.529 229152 DEBUG nova.virt.hardware [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  1 05:22:30 np0005540826 nova_compute[229148]: 2025-12-01 10:22:30.529 229152 DEBUG nova.virt.hardware [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  1 05:22:30 np0005540826 nova_compute[229148]: 2025-12-01 10:22:30.529 229152 DEBUG nova.virt.hardware [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  1 05:22:30 np0005540826 nova_compute[229148]: 2025-12-01 10:22:30.530 229152 DEBUG nova.virt.hardware [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  1 05:22:30 np0005540826 nova_compute[229148]: 2025-12-01 10:22:30.530 229152 DEBUG nova.virt.hardware [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  1 05:22:30 np0005540826 nova_compute[229148]: 2025-12-01 10:22:30.531 229152 DEBUG nova.virt.hardware [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  1 05:22:30 np0005540826 nova_compute[229148]: 2025-12-01 10:22:30.531 229152 DEBUG nova.virt.hardware [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  1 05:22:30 np0005540826 nova_compute[229148]: 2025-12-01 10:22:30.531 229152 DEBUG nova.virt.hardware [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  1 05:22:30 np0005540826 nova_compute[229148]: 2025-12-01 10:22:30.536 229152 DEBUG oslo_concurrency.processutils [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:22:30 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:22:30 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:22:30 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:22:30.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:22:30 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec  1 05:22:30 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2667232419' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Dec  1 05:22:30 np0005540826 nova_compute[229148]: 2025-12-01 10:22:30.982 229152 DEBUG oslo_concurrency.processutils [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.447s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:22:31 np0005540826 nova_compute[229148]: 2025-12-01 10:22:31.010 229152 DEBUG nova.storage.rbd_utils [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] rbd image 60827f60-9be9-4785-89b4-96d2a13acd60_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Dec  1 05:22:31 np0005540826 nova_compute[229148]: 2025-12-01 10:22:31.015 229152 DEBUG oslo_concurrency.processutils [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:22:31 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:22:31 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 05:22:31 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:22:31.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 05:22:31 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec  1 05:22:31 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/761622790' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Dec  1 05:22:31 np0005540826 nova_compute[229148]: 2025-12-01 10:22:31.497 229152 DEBUG oslo_concurrency.processutils [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.482s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:22:31 np0005540826 nova_compute[229148]: 2025-12-01 10:22:31.499 229152 DEBUG nova.virt.libvirt.vif [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-01T10:22:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-265461962',display_name='tempest-TestNetworkBasicOps-server-265461962',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-265461962',id=8,image_ref='8f75d6de-6ce0-44e1-b417-d0111424475b',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLWUDl7Ca7PFjMDqatDjDzP+MrULe+8d7/8ygK8VWLub4JVkFJbBSYVM4G5JdW5K14Jsm/7uUv6P4RSF9sw4sal7fKj0kOrUNJcfmqaTM/e9p8fmKwhe/r8xlpvlLI5PaQ==',key_name='tempest-TestNetworkBasicOps-1942288070',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9f6be4e572624210b91193c011607c08',ramdisk_id='',reservation_id='r-ixb0yd98',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='8f75d6de-6ce0-44e1-b417-d0111424475b',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1248115384',owner_user_name='tempest-TestNetworkBasicOps-1248115384-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-01T10:22:27Z,user_data=None,user_id='5b56a238daf0445798410e51caada0ff',uuid=60827f60-9be9-4785-89b4-96d2a13acd60,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a18935ec-0bdc-41b0-9e52-6e3919b1ede3", "address": "fa:16:3e:4d:c6:8f", "network": {"id": "434ae97b-0a30-409f-b9ad-87922177cfc0", "bridge": "br-int", "label": "tempest-network-smoke--141806258", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa18935ec-0b", "ovs_interfaceid": "a18935ec-0bdc-41b0-9e52-6e3919b1ede3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  1 05:22:31 np0005540826 nova_compute[229148]: 2025-12-01 10:22:31.500 229152 DEBUG nova.network.os_vif_util [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Converting VIF {"id": "a18935ec-0bdc-41b0-9e52-6e3919b1ede3", "address": "fa:16:3e:4d:c6:8f", "network": {"id": "434ae97b-0a30-409f-b9ad-87922177cfc0", "bridge": "br-int", "label": "tempest-network-smoke--141806258", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa18935ec-0b", "ovs_interfaceid": "a18935ec-0bdc-41b0-9e52-6e3919b1ede3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  1 05:22:31 np0005540826 nova_compute[229148]: 2025-12-01 10:22:31.501 229152 DEBUG nova.network.os_vif_util [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4d:c6:8f,bridge_name='br-int',has_traffic_filtering=True,id=a18935ec-0bdc-41b0-9e52-6e3919b1ede3,network=Network(434ae97b-0a30-409f-b9ad-87922177cfc0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapa18935ec-0b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  1 05:22:31 np0005540826 nova_compute[229148]: 2025-12-01 10:22:31.502 229152 DEBUG nova.objects.instance [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lazy-loading 'pci_devices' on Instance uuid 60827f60-9be9-4785-89b4-96d2a13acd60 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  1 05:22:31 np0005540826 nova_compute[229148]: 2025-12-01 10:22:31.516 229152 DEBUG nova.network.neutron [req-1c107054-a618-4009-aafa-0b700e39217b req-d053ac61-e090-46ad-9e38-2c16646683a1 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 60827f60-9be9-4785-89b4-96d2a13acd60] Updated VIF entry in instance network info cache for port a18935ec-0bdc-41b0-9e52-6e3919b1ede3. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  1 05:22:31 np0005540826 nova_compute[229148]: 2025-12-01 10:22:31.516 229152 DEBUG nova.network.neutron [req-1c107054-a618-4009-aafa-0b700e39217b req-d053ac61-e090-46ad-9e38-2c16646683a1 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 60827f60-9be9-4785-89b4-96d2a13acd60] Updating instance_info_cache with network_info: [{"id": "a18935ec-0bdc-41b0-9e52-6e3919b1ede3", "address": "fa:16:3e:4d:c6:8f", "network": {"id": "434ae97b-0a30-409f-b9ad-87922177cfc0", "bridge": "br-int", "label": "tempest-network-smoke--141806258", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa18935ec-0b", "ovs_interfaceid": "a18935ec-0bdc-41b0-9e52-6e3919b1ede3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  1 05:22:31 np0005540826 nova_compute[229148]: 2025-12-01 10:22:31.521 229152 DEBUG nova.virt.libvirt.driver [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 60827f60-9be9-4785-89b4-96d2a13acd60] End _get_guest_xml xml=<domain type="kvm">
Dec  1 05:22:31 np0005540826 nova_compute[229148]:  <uuid>60827f60-9be9-4785-89b4-96d2a13acd60</uuid>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:  <name>instance-00000008</name>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:  <memory>131072</memory>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:  <vcpu>1</vcpu>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:  <metadata>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  1 05:22:31 np0005540826 nova_compute[229148]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:      <nova:name>tempest-TestNetworkBasicOps-server-265461962</nova:name>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:      <nova:creationTime>2025-12-01 10:22:30</nova:creationTime>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:      <nova:flavor name="m1.nano">
Dec  1 05:22:31 np0005540826 nova_compute[229148]:        <nova:memory>128</nova:memory>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:        <nova:disk>1</nova:disk>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:        <nova:swap>0</nova:swap>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:        <nova:ephemeral>0</nova:ephemeral>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:        <nova:vcpus>1</nova:vcpus>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:      </nova:flavor>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:      <nova:owner>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:        <nova:user uuid="5b56a238daf0445798410e51caada0ff">tempest-TestNetworkBasicOps-1248115384-project-member</nova:user>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:        <nova:project uuid="9f6be4e572624210b91193c011607c08">tempest-TestNetworkBasicOps-1248115384</nova:project>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:      </nova:owner>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:      <nova:root type="image" uuid="8f75d6de-6ce0-44e1-b417-d0111424475b"/>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:      <nova:ports>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:        <nova:port uuid="a18935ec-0bdc-41b0-9e52-6e3919b1ede3">
Dec  1 05:22:31 np0005540826 nova_compute[229148]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:        </nova:port>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:      </nova:ports>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:    </nova:instance>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:  </metadata>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:  <sysinfo type="smbios">
Dec  1 05:22:31 np0005540826 nova_compute[229148]:    <system>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:      <entry name="manufacturer">RDO</entry>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:      <entry name="product">OpenStack Compute</entry>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:      <entry name="serial">60827f60-9be9-4785-89b4-96d2a13acd60</entry>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:      <entry name="uuid">60827f60-9be9-4785-89b4-96d2a13acd60</entry>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:      <entry name="family">Virtual Machine</entry>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:    </system>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:  </sysinfo>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:  <os>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:    <boot dev="hd"/>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:    <smbios mode="sysinfo"/>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:  </os>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:  <features>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:    <acpi/>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:    <apic/>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:    <vmcoreinfo/>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:  </features>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:  <clock offset="utc">
Dec  1 05:22:31 np0005540826 nova_compute[229148]:    <timer name="pit" tickpolicy="delay"/>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:    <timer name="hpet" present="no"/>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:  </clock>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:  <cpu mode="host-model" match="exact">
Dec  1 05:22:31 np0005540826 nova_compute[229148]:    <topology sockets="1" cores="1" threads="1"/>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:  </cpu>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:  <devices>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:    <disk type="network" device="disk">
Dec  1 05:22:31 np0005540826 nova_compute[229148]:      <driver type="raw" cache="none"/>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:      <source protocol="rbd" name="vms/60827f60-9be9-4785-89b4-96d2a13acd60_disk">
Dec  1 05:22:31 np0005540826 nova_compute[229148]:        <host name="192.168.122.100" port="6789"/>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:        <host name="192.168.122.102" port="6789"/>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:        <host name="192.168.122.101" port="6789"/>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:      </source>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:      <auth username="openstack">
Dec  1 05:22:31 np0005540826 nova_compute[229148]:        <secret type="ceph" uuid="365f19c2-81e5-5edd-b6b4-280555214d3a"/>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:      </auth>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:      <target dev="vda" bus="virtio"/>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:    </disk>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:    <disk type="network" device="cdrom">
Dec  1 05:22:31 np0005540826 nova_compute[229148]:      <driver type="raw" cache="none"/>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:      <source protocol="rbd" name="vms/60827f60-9be9-4785-89b4-96d2a13acd60_disk.config">
Dec  1 05:22:31 np0005540826 nova_compute[229148]:        <host name="192.168.122.100" port="6789"/>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:        <host name="192.168.122.102" port="6789"/>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:        <host name="192.168.122.101" port="6789"/>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:      </source>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:      <auth username="openstack">
Dec  1 05:22:31 np0005540826 nova_compute[229148]:        <secret type="ceph" uuid="365f19c2-81e5-5edd-b6b4-280555214d3a"/>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:      </auth>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:      <target dev="sda" bus="sata"/>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:    </disk>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:    <interface type="ethernet">
Dec  1 05:22:31 np0005540826 nova_compute[229148]:      <mac address="fa:16:3e:4d:c6:8f"/>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:      <model type="virtio"/>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:      <driver name="vhost" rx_queue_size="512"/>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:      <mtu size="1442"/>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:      <target dev="tapa18935ec-0b"/>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:    </interface>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:    <serial type="pty">
Dec  1 05:22:31 np0005540826 nova_compute[229148]:      <log file="/var/lib/nova/instances/60827f60-9be9-4785-89b4-96d2a13acd60/console.log" append="off"/>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:    </serial>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:    <video>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:      <model type="virtio"/>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:    </video>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:    <input type="tablet" bus="usb"/>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:    <rng model="virtio">
Dec  1 05:22:31 np0005540826 nova_compute[229148]:      <backend model="random">/dev/urandom</backend>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:    </rng>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root"/>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:    <controller type="usb" index="0"/>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:    <memballoon model="virtio">
Dec  1 05:22:31 np0005540826 nova_compute[229148]:      <stats period="10"/>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:    </memballoon>
Dec  1 05:22:31 np0005540826 nova_compute[229148]:  </devices>
Dec  1 05:22:31 np0005540826 nova_compute[229148]: </domain>
Dec  1 05:22:31 np0005540826 nova_compute[229148]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  1 05:22:31 np0005540826 nova_compute[229148]: 2025-12-01 10:22:31.523 229152 DEBUG nova.compute.manager [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 60827f60-9be9-4785-89b4-96d2a13acd60] Preparing to wait for external event network-vif-plugged-a18935ec-0bdc-41b0-9e52-6e3919b1ede3 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  1 05:22:31 np0005540826 nova_compute[229148]: 2025-12-01 10:22:31.523 229152 DEBUG oslo_concurrency.lockutils [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Acquiring lock "60827f60-9be9-4785-89b4-96d2a13acd60-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:22:31 np0005540826 nova_compute[229148]: 2025-12-01 10:22:31.524 229152 DEBUG oslo_concurrency.lockutils [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "60827f60-9be9-4785-89b4-96d2a13acd60-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:22:31 np0005540826 nova_compute[229148]: 2025-12-01 10:22:31.524 229152 DEBUG oslo_concurrency.lockutils [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "60827f60-9be9-4785-89b4-96d2a13acd60-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:22:31 np0005540826 nova_compute[229148]: 2025-12-01 10:22:31.525 229152 DEBUG nova.virt.libvirt.vif [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-01T10:22:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-265461962',display_name='tempest-TestNetworkBasicOps-server-265461962',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-265461962',id=8,image_ref='8f75d6de-6ce0-44e1-b417-d0111424475b',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLWUDl7Ca7PFjMDqatDjDzP+MrULe+8d7/8ygK8VWLub4JVkFJbBSYVM4G5JdW5K14Jsm/7uUv6P4RSF9sw4sal7fKj0kOrUNJcfmqaTM/e9p8fmKwhe/r8xlpvlLI5PaQ==',key_name='tempest-TestNetworkBasicOps-1942288070',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9f6be4e572624210b91193c011607c08',ramdisk_id='',reservation_id='r-ixb0yd98',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='8f75d6de-6ce0-44e1-b417-d0111424475b',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1248115384',owner_user_name='tempest-TestNetworkBasicOps-1248115384-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-01T10:22:27Z,user_data=None,user_id='5b56a238daf0445798410e51caada0ff',uuid=60827f60-9be9-4785-89b4-96d2a13acd60,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a18935ec-0bdc-41b0-9e52-6e3919b1ede3", "address": "fa:16:3e:4d:c6:8f", "network": {"id": "434ae97b-0a30-409f-b9ad-87922177cfc0", "bridge": "br-int", "label": "tempest-network-smoke--141806258", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa18935ec-0b", "ovs_interfaceid": "a18935ec-0bdc-41b0-9e52-6e3919b1ede3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  1 05:22:31 np0005540826 nova_compute[229148]: 2025-12-01 10:22:31.525 229152 DEBUG nova.network.os_vif_util [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Converting VIF {"id": "a18935ec-0bdc-41b0-9e52-6e3919b1ede3", "address": "fa:16:3e:4d:c6:8f", "network": {"id": "434ae97b-0a30-409f-b9ad-87922177cfc0", "bridge": "br-int", "label": "tempest-network-smoke--141806258", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa18935ec-0b", "ovs_interfaceid": "a18935ec-0bdc-41b0-9e52-6e3919b1ede3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  1 05:22:31 np0005540826 nova_compute[229148]: 2025-12-01 10:22:31.526 229152 DEBUG nova.network.os_vif_util [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4d:c6:8f,bridge_name='br-int',has_traffic_filtering=True,id=a18935ec-0bdc-41b0-9e52-6e3919b1ede3,network=Network(434ae97b-0a30-409f-b9ad-87922177cfc0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapa18935ec-0b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  1 05:22:31 np0005540826 nova_compute[229148]: 2025-12-01 10:22:31.527 229152 DEBUG os_vif [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4d:c6:8f,bridge_name='br-int',has_traffic_filtering=True,id=a18935ec-0bdc-41b0-9e52-6e3919b1ede3,network=Network(434ae97b-0a30-409f-b9ad-87922177cfc0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapa18935ec-0b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  1 05:22:31 np0005540826 nova_compute[229148]: 2025-12-01 10:22:31.527 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:22:31 np0005540826 nova_compute[229148]: 2025-12-01 10:22:31.528 229152 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  1 05:22:31 np0005540826 nova_compute[229148]: 2025-12-01 10:22:31.529 229152 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  1 05:22:31 np0005540826 nova_compute[229148]: 2025-12-01 10:22:31.531 229152 DEBUG oslo_concurrency.lockutils [req-1c107054-a618-4009-aafa-0b700e39217b req-d053ac61-e090-46ad-9e38-2c16646683a1 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Releasing lock "refresh_cache-60827f60-9be9-4785-89b4-96d2a13acd60" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  1 05:22:31 np0005540826 nova_compute[229148]: 2025-12-01 10:22:31.534 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:22:31 np0005540826 nova_compute[229148]: 2025-12-01 10:22:31.534 229152 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa18935ec-0b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  1 05:22:31 np0005540826 nova_compute[229148]: 2025-12-01 10:22:31.534 229152 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa18935ec-0b, col_values=(('external_ids', {'iface-id': 'a18935ec-0bdc-41b0-9e52-6e3919b1ede3', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:4d:c6:8f', 'vm-uuid': '60827f60-9be9-4785-89b4-96d2a13acd60'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  1 05:22:31 np0005540826 nova_compute[229148]: 2025-12-01 10:22:31.536 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:22:31 np0005540826 NetworkManager[48989]: <info>  [1764584551.5376] manager: (tapa18935ec-0b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/57)
Dec  1 05:22:31 np0005540826 nova_compute[229148]: 2025-12-01 10:22:31.539 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  1 05:22:31 np0005540826 nova_compute[229148]: 2025-12-01 10:22:31.542 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:22:31 np0005540826 nova_compute[229148]: 2025-12-01 10:22:31.543 229152 INFO os_vif [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4d:c6:8f,bridge_name='br-int',has_traffic_filtering=True,id=a18935ec-0bdc-41b0-9e52-6e3919b1ede3,network=Network(434ae97b-0a30-409f-b9ad-87922177cfc0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapa18935ec-0b')#033[00m
Dec  1 05:22:31 np0005540826 nova_compute[229148]: 2025-12-01 10:22:31.601 229152 DEBUG nova.virt.libvirt.driver [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  1 05:22:31 np0005540826 nova_compute[229148]: 2025-12-01 10:22:31.601 229152 DEBUG nova.virt.libvirt.driver [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  1 05:22:31 np0005540826 nova_compute[229148]: 2025-12-01 10:22:31.602 229152 DEBUG nova.virt.libvirt.driver [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] No VIF found with MAC fa:16:3e:4d:c6:8f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  1 05:22:31 np0005540826 nova_compute[229148]: 2025-12-01 10:22:31.602 229152 INFO nova.virt.libvirt.driver [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 60827f60-9be9-4785-89b4-96d2a13acd60] Using config drive#033[00m
Dec  1 05:22:31 np0005540826 nova_compute[229148]: 2025-12-01 10:22:31.627 229152 DEBUG nova.storage.rbd_utils [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] rbd image 60827f60-9be9-4785-89b4-96d2a13acd60_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Dec  1 05:22:31 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:22:31 np0005540826 nova_compute[229148]: 2025-12-01 10:22:31.946 229152 INFO nova.virt.libvirt.driver [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 60827f60-9be9-4785-89b4-96d2a13acd60] Creating config drive at /var/lib/nova/instances/60827f60-9be9-4785-89b4-96d2a13acd60/disk.config#033[00m
Dec  1 05:22:31 np0005540826 nova_compute[229148]: 2025-12-01 10:22:31.951 229152 DEBUG oslo_concurrency.processutils [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/60827f60-9be9-4785-89b4-96d2a13acd60/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpl0nckmfq execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:22:32 np0005540826 nova_compute[229148]: 2025-12-01 10:22:32.079 229152 DEBUG oslo_concurrency.processutils [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/60827f60-9be9-4785-89b4-96d2a13acd60/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpl0nckmfq" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:22:32 np0005540826 nova_compute[229148]: 2025-12-01 10:22:32.107 229152 DEBUG nova.storage.rbd_utils [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] rbd image 60827f60-9be9-4785-89b4-96d2a13acd60_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Dec  1 05:22:32 np0005540826 nova_compute[229148]: 2025-12-01 10:22:32.111 229152 DEBUG oslo_concurrency.processutils [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/60827f60-9be9-4785-89b4-96d2a13acd60/disk.config 60827f60-9be9-4785-89b4-96d2a13acd60_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:22:32 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:22:32 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:22:32 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:22:32.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:22:32 np0005540826 nova_compute[229148]: 2025-12-01 10:22:32.621 229152 DEBUG oslo_concurrency.processutils [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/60827f60-9be9-4785-89b4-96d2a13acd60/disk.config 60827f60-9be9-4785-89b4-96d2a13acd60_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.510s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:22:32 np0005540826 nova_compute[229148]: 2025-12-01 10:22:32.623 229152 INFO nova.virt.libvirt.driver [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 60827f60-9be9-4785-89b4-96d2a13acd60] Deleting local config drive /var/lib/nova/instances/60827f60-9be9-4785-89b4-96d2a13acd60/disk.config because it was imported into RBD.#033[00m
Dec  1 05:22:32 np0005540826 systemd[1]: Starting libvirt secret daemon...
Dec  1 05:22:32 np0005540826 systemd[1]: Started libvirt secret daemon.
Dec  1 05:22:32 np0005540826 kernel: tapa18935ec-0b: entered promiscuous mode
Dec  1 05:22:32 np0005540826 NetworkManager[48989]: <info>  [1764584552.7284] manager: (tapa18935ec-0b): new Tun device (/org/freedesktop/NetworkManager/Devices/58)
Dec  1 05:22:32 np0005540826 ovn_controller[132309]: 2025-12-01T10:22:32Z|00085|binding|INFO|Claiming lport a18935ec-0bdc-41b0-9e52-6e3919b1ede3 for this chassis.
Dec  1 05:22:32 np0005540826 ovn_controller[132309]: 2025-12-01T10:22:32Z|00086|binding|INFO|a18935ec-0bdc-41b0-9e52-6e3919b1ede3: Claiming fa:16:3e:4d:c6:8f 10.100.0.7
Dec  1 05:22:32 np0005540826 nova_compute[229148]: 2025-12-01 10:22:32.728 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:22:32 np0005540826 nova_compute[229148]: 2025-12-01 10:22:32.736 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:22:32 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:22:32.747 141685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4d:c6:8f 10.100.0.7'], port_security=['fa:16:3e:4d:c6:8f 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-1647305629', 'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '60827f60-9be9-4785-89b4-96d2a13acd60', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-434ae97b-0a30-409f-b9ad-87922177cfc0', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-1647305629', 'neutron:project_id': '9f6be4e572624210b91193c011607c08', 'neutron:revision_number': '2', 'neutron:security_group_ids': '11501149-732d-4202-ad97-ece49baad0dd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3b8707eb-b5e9-4720-9f51-1840140506cb, chassis=[<ovs.db.idl.Row object at 0x7fe07c6626d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe07c6626d0>], logical_port=a18935ec-0bdc-41b0-9e52-6e3919b1ede3) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  1 05:22:32 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:22:32.749 141685 INFO neutron.agent.ovn.metadata.agent [-] Port a18935ec-0bdc-41b0-9e52-6e3919b1ede3 in datapath 434ae97b-0a30-409f-b9ad-87922177cfc0 bound to our chassis#033[00m
Dec  1 05:22:32 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:22:32.750 141685 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 434ae97b-0a30-409f-b9ad-87922177cfc0#033[00m
Dec  1 05:22:32 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:22:32.763 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[1675df8a-d0d7-4848-a19e-4f02a771a8bd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:22:32 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:22:32.764 141685 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap434ae97b-01 in ovnmeta-434ae97b-0a30-409f-b9ad-87922177cfc0 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec  1 05:22:32 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:22:32.767 233565 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap434ae97b-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec  1 05:22:32 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:22:32.767 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[cbcc6979-8921-459a-83e1-61367593bcaf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:22:32 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:22:32.768 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[73afb3bc-299c-4c70-8f7d-cb24a514c29c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:22:32 np0005540826 systemd-machined[192474]: New machine qemu-5-instance-00000008.
Dec  1 05:22:32 np0005540826 systemd-udevd[240314]: Network interface NamePolicy= disabled on kernel command line.
Dec  1 05:22:32 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:22:32.781 141797 DEBUG oslo.privsep.daemon [-] privsep: reply[03437e4e-6a3e-463f-8d19-ef01cba039fb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:22:32 np0005540826 NetworkManager[48989]: <info>  [1764584552.7928] device (tapa18935ec-0b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  1 05:22:32 np0005540826 NetworkManager[48989]: <info>  [1764584552.7937] device (tapa18935ec-0b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  1 05:22:32 np0005540826 nova_compute[229148]: 2025-12-01 10:22:32.795 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:22:32 np0005540826 systemd[1]: Started Virtual Machine qemu-5-instance-00000008.
Dec  1 05:22:32 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:22:32.798 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[b1e24b72-79b6-4671-bc82-1dc00e172876]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:22:32 np0005540826 ovn_controller[132309]: 2025-12-01T10:22:32Z|00087|binding|INFO|Setting lport a18935ec-0bdc-41b0-9e52-6e3919b1ede3 ovn-installed in OVS
Dec  1 05:22:32 np0005540826 ovn_controller[132309]: 2025-12-01T10:22:32Z|00088|binding|INFO|Setting lport a18935ec-0bdc-41b0-9e52-6e3919b1ede3 up in Southbound
Dec  1 05:22:32 np0005540826 nova_compute[229148]: 2025-12-01 10:22:32.804 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:22:32 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:22:32.829 233643 DEBUG oslo.privsep.daemon [-] privsep: reply[382bca10-606c-4429-bcdd-a1384ccd7cab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:22:32 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:22:32.835 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[74a52c7f-604a-4627-bc0e-b9388d6e8fb2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:22:32 np0005540826 NetworkManager[48989]: <info>  [1764584552.8365] manager: (tap434ae97b-00): new Veth device (/org/freedesktop/NetworkManager/Devices/59)
Dec  1 05:22:32 np0005540826 systemd-udevd[240317]: Network interface NamePolicy= disabled on kernel command line.
Dec  1 05:22:32 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:22:32.870 233643 DEBUG oslo.privsep.daemon [-] privsep: reply[50ffe86a-9d53-479d-8c3c-7026d0583e17]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:22:32 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:22:32.873 233643 DEBUG oslo.privsep.daemon [-] privsep: reply[5d4c342d-aeda-48e2-b832-d31fc7b6bd85]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:22:32 np0005540826 NetworkManager[48989]: <info>  [1764584552.8967] device (tap434ae97b-00): carrier: link connected
Dec  1 05:22:32 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:22:32.902 233643 DEBUG oslo.privsep.daemon [-] privsep: reply[4f62440b-62b2-4734-a84f-ae834e43326a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:22:32 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:22:32.917 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[574008fc-fdc1-416d-946d-0f53cc1c88f8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap434ae97b-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1f:51:ff'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 29], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 441541, 'reachable_time': 33308, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 240345, 'error': None, 'target': 'ovnmeta-434ae97b-0a30-409f-b9ad-87922177cfc0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:22:32 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:22:32.930 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[30330241-06e5-4c5c-976c-9fc7d5dc7a62]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe1f:51ff'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 441541, 'tstamp': 441541}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 240346, 'error': None, 'target': 'ovnmeta-434ae97b-0a30-409f-b9ad-87922177cfc0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:22:32 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:22:32.947 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[a1a840f0-49c9-457f-bee8-5f4a3f67faff]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap434ae97b-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1f:51:ff'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 29], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 441541, 'reachable_time': 33308, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 240347, 'error': None, 'target': 'ovnmeta-434ae97b-0a30-409f-b9ad-87922177cfc0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:22:32 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:22:32.974 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[fb0a6a89-8ffb-419c-9218-81eff4dd7e19]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:22:33 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:22:33.037 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[bb3f3478-9f9f-484b-9fc3-7b3bbbb5cd40]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:22:33 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:22:33.039 141685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap434ae97b-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  1 05:22:33 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:22:33.039 141685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  1 05:22:33 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:22:33.040 141685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap434ae97b-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  1 05:22:33 np0005540826 kernel: tap434ae97b-00: entered promiscuous mode
Dec  1 05:22:33 np0005540826 NetworkManager[48989]: <info>  [1764584553.0425] manager: (tap434ae97b-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/60)
Dec  1 05:22:33 np0005540826 nova_compute[229148]: 2025-12-01 10:22:33.042 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:22:33 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:22:33.046 141685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap434ae97b-00, col_values=(('external_ids', {'iface-id': '45d4e66b-9979-4881-8973-52e53617afe5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  1 05:22:33 np0005540826 ovn_controller[132309]: 2025-12-01T10:22:33Z|00089|binding|INFO|Releasing lport 45d4e66b-9979-4881-8973-52e53617afe5 from this chassis (sb_readonly=0)
Dec  1 05:22:33 np0005540826 nova_compute[229148]: 2025-12-01 10:22:33.047 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:22:33 np0005540826 nova_compute[229148]: 2025-12-01 10:22:33.048 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:22:33 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:22:33.049 141685 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/434ae97b-0a30-409f-b9ad-87922177cfc0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/434ae97b-0a30-409f-b9ad-87922177cfc0.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec  1 05:22:33 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:22:33.050 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[f5014a30-27c6-4358-ae8e-3907df9c35dc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:22:33 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:22:33.050 141685 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec  1 05:22:33 np0005540826 ovn_metadata_agent[141680]: global
Dec  1 05:22:33 np0005540826 ovn_metadata_agent[141680]:    log         /dev/log local0 debug
Dec  1 05:22:33 np0005540826 ovn_metadata_agent[141680]:    log-tag     haproxy-metadata-proxy-434ae97b-0a30-409f-b9ad-87922177cfc0
Dec  1 05:22:33 np0005540826 ovn_metadata_agent[141680]:    user        root
Dec  1 05:22:33 np0005540826 ovn_metadata_agent[141680]:    group       root
Dec  1 05:22:33 np0005540826 ovn_metadata_agent[141680]:    maxconn     1024
Dec  1 05:22:33 np0005540826 ovn_metadata_agent[141680]:    pidfile     /var/lib/neutron/external/pids/434ae97b-0a30-409f-b9ad-87922177cfc0.pid.haproxy
Dec  1 05:22:33 np0005540826 ovn_metadata_agent[141680]:    daemon
Dec  1 05:22:33 np0005540826 ovn_metadata_agent[141680]: 
Dec  1 05:22:33 np0005540826 ovn_metadata_agent[141680]: defaults
Dec  1 05:22:33 np0005540826 ovn_metadata_agent[141680]:    log global
Dec  1 05:22:33 np0005540826 ovn_metadata_agent[141680]:    mode http
Dec  1 05:22:33 np0005540826 ovn_metadata_agent[141680]:    option httplog
Dec  1 05:22:33 np0005540826 ovn_metadata_agent[141680]:    option dontlognull
Dec  1 05:22:33 np0005540826 ovn_metadata_agent[141680]:    option http-server-close
Dec  1 05:22:33 np0005540826 ovn_metadata_agent[141680]:    option forwardfor
Dec  1 05:22:33 np0005540826 ovn_metadata_agent[141680]:    retries                 3
Dec  1 05:22:33 np0005540826 ovn_metadata_agent[141680]:    timeout http-request    30s
Dec  1 05:22:33 np0005540826 ovn_metadata_agent[141680]:    timeout connect         30s
Dec  1 05:22:33 np0005540826 ovn_metadata_agent[141680]:    timeout client          32s
Dec  1 05:22:33 np0005540826 ovn_metadata_agent[141680]:    timeout server          32s
Dec  1 05:22:33 np0005540826 ovn_metadata_agent[141680]:    timeout http-keep-alive 30s
Dec  1 05:22:33 np0005540826 ovn_metadata_agent[141680]: 
Dec  1 05:22:33 np0005540826 ovn_metadata_agent[141680]: 
Dec  1 05:22:33 np0005540826 ovn_metadata_agent[141680]: listen listener
Dec  1 05:22:33 np0005540826 ovn_metadata_agent[141680]:    bind 169.254.169.254:80
Dec  1 05:22:33 np0005540826 ovn_metadata_agent[141680]:    server metadata /var/lib/neutron/metadata_proxy
Dec  1 05:22:33 np0005540826 ovn_metadata_agent[141680]:    http-request add-header X-OVN-Network-ID 434ae97b-0a30-409f-b9ad-87922177cfc0
Dec  1 05:22:33 np0005540826 ovn_metadata_agent[141680]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec  1 05:22:33 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:22:33.051 141685 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-434ae97b-0a30-409f-b9ad-87922177cfc0', 'env', 'PROCESS_TAG=haproxy-434ae97b-0a30-409f-b9ad-87922177cfc0', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/434ae97b-0a30-409f-b9ad-87922177cfc0.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec  1 05:22:33 np0005540826 nova_compute[229148]: 2025-12-01 10:22:33.061 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:22:33 np0005540826 nova_compute[229148]: 2025-12-01 10:22:33.265 229152 DEBUG nova.virt.driver [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] Emitting event <LifecycleEvent: 1764584553.2650342, 60827f60-9be9-4785-89b4-96d2a13acd60 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  1 05:22:33 np0005540826 nova_compute[229148]: 2025-12-01 10:22:33.266 229152 INFO nova.compute.manager [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] [instance: 60827f60-9be9-4785-89b4-96d2a13acd60] VM Started (Lifecycle Event)#033[00m
Dec  1 05:22:33 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:22:33 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:22:33 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:22:33.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:22:33 np0005540826 nova_compute[229148]: 2025-12-01 10:22:33.292 229152 DEBUG nova.compute.manager [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] [instance: 60827f60-9be9-4785-89b4-96d2a13acd60] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  1 05:22:33 np0005540826 nova_compute[229148]: 2025-12-01 10:22:33.297 229152 DEBUG nova.virt.driver [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] Emitting event <LifecycleEvent: 1764584553.2652805, 60827f60-9be9-4785-89b4-96d2a13acd60 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  1 05:22:33 np0005540826 nova_compute[229148]: 2025-12-01 10:22:33.297 229152 INFO nova.compute.manager [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] [instance: 60827f60-9be9-4785-89b4-96d2a13acd60] VM Paused (Lifecycle Event)#033[00m
Dec  1 05:22:33 np0005540826 nova_compute[229148]: 2025-12-01 10:22:33.315 229152 DEBUG nova.compute.manager [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] [instance: 60827f60-9be9-4785-89b4-96d2a13acd60] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  1 05:22:33 np0005540826 nova_compute[229148]: 2025-12-01 10:22:33.319 229152 DEBUG nova.compute.manager [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] [instance: 60827f60-9be9-4785-89b4-96d2a13acd60] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  1 05:22:33 np0005540826 nova_compute[229148]: 2025-12-01 10:22:33.338 229152 INFO nova.compute.manager [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] [instance: 60827f60-9be9-4785-89b4-96d2a13acd60] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  1 05:22:33 np0005540826 podman[240421]: 2025-12-01 10:22:33.445136536 +0000 UTC m=+0.052462315 container create 5d6473296d0ab431bd07c257b3d4b38e4facced4b70de4cce14bafac47186fca (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-434ae97b-0a30-409f-b9ad-87922177cfc0, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Dec  1 05:22:33 np0005540826 systemd[1]: Started libpod-conmon-5d6473296d0ab431bd07c257b3d4b38e4facced4b70de4cce14bafac47186fca.scope.
Dec  1 05:22:33 np0005540826 podman[240421]: 2025-12-01 10:22:33.416455989 +0000 UTC m=+0.023781798 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  1 05:22:33 np0005540826 systemd[1]: Started libcrun container.
Dec  1 05:22:33 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fa4dfc3534ccabeed5ae0df17b78c24e33c2e5ecd559ef65e7adf15020a6b8d6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  1 05:22:33 np0005540826 podman[240421]: 2025-12-01 10:22:33.548980116 +0000 UTC m=+0.156305915 container init 5d6473296d0ab431bd07c257b3d4b38e4facced4b70de4cce14bafac47186fca (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-434ae97b-0a30-409f-b9ad-87922177cfc0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  1 05:22:33 np0005540826 podman[240421]: 2025-12-01 10:22:33.555309483 +0000 UTC m=+0.162635262 container start 5d6473296d0ab431bd07c257b3d4b38e4facced4b70de4cce14bafac47186fca (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-434ae97b-0a30-409f-b9ad-87922177cfc0, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125)
Dec  1 05:22:33 np0005540826 neutron-haproxy-ovnmeta-434ae97b-0a30-409f-b9ad-87922177cfc0[240436]: [NOTICE]   (240440) : New worker (240442) forked
Dec  1 05:22:33 np0005540826 neutron-haproxy-ovnmeta-434ae97b-0a30-409f-b9ad-87922177cfc0[240436]: [NOTICE]   (240440) : Loading success.
Dec  1 05:22:34 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:22:34 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 05:22:34 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:22:34.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 05:22:34 np0005540826 nova_compute[229148]: 2025-12-01 10:22:34.730 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:22:34 np0005540826 nova_compute[229148]: 2025-12-01 10:22:34.776 229152 DEBUG nova.compute.manager [req-7bb691b9-6f3b-4592-a771-8ef2640ed185 req-e8870e2b-9357-4af2-b0de-7d639bf4a9a0 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 60827f60-9be9-4785-89b4-96d2a13acd60] Received event network-vif-plugged-a18935ec-0bdc-41b0-9e52-6e3919b1ede3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  1 05:22:34 np0005540826 nova_compute[229148]: 2025-12-01 10:22:34.777 229152 DEBUG oslo_concurrency.lockutils [req-7bb691b9-6f3b-4592-a771-8ef2640ed185 req-e8870e2b-9357-4af2-b0de-7d639bf4a9a0 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Acquiring lock "60827f60-9be9-4785-89b4-96d2a13acd60-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:22:34 np0005540826 nova_compute[229148]: 2025-12-01 10:22:34.777 229152 DEBUG oslo_concurrency.lockutils [req-7bb691b9-6f3b-4592-a771-8ef2640ed185 req-e8870e2b-9357-4af2-b0de-7d639bf4a9a0 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Lock "60827f60-9be9-4785-89b4-96d2a13acd60-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:22:34 np0005540826 nova_compute[229148]: 2025-12-01 10:22:34.777 229152 DEBUG oslo_concurrency.lockutils [req-7bb691b9-6f3b-4592-a771-8ef2640ed185 req-e8870e2b-9357-4af2-b0de-7d639bf4a9a0 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Lock "60827f60-9be9-4785-89b4-96d2a13acd60-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:22:34 np0005540826 nova_compute[229148]: 2025-12-01 10:22:34.778 229152 DEBUG nova.compute.manager [req-7bb691b9-6f3b-4592-a771-8ef2640ed185 req-e8870e2b-9357-4af2-b0de-7d639bf4a9a0 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 60827f60-9be9-4785-89b4-96d2a13acd60] Processing event network-vif-plugged-a18935ec-0bdc-41b0-9e52-6e3919b1ede3 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec  1 05:22:34 np0005540826 nova_compute[229148]: 2025-12-01 10:22:34.778 229152 DEBUG nova.compute.manager [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 60827f60-9be9-4785-89b4-96d2a13acd60] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  1 05:22:34 np0005540826 nova_compute[229148]: 2025-12-01 10:22:34.782 229152 DEBUG nova.virt.driver [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] Emitting event <LifecycleEvent: 1764584554.7821035, 60827f60-9be9-4785-89b4-96d2a13acd60 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  1 05:22:34 np0005540826 nova_compute[229148]: 2025-12-01 10:22:34.782 229152 INFO nova.compute.manager [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] [instance: 60827f60-9be9-4785-89b4-96d2a13acd60] VM Resumed (Lifecycle Event)#033[00m
Dec  1 05:22:34 np0005540826 nova_compute[229148]: 2025-12-01 10:22:34.784 229152 DEBUG nova.virt.libvirt.driver [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 60827f60-9be9-4785-89b4-96d2a13acd60] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  1 05:22:34 np0005540826 nova_compute[229148]: 2025-12-01 10:22:34.787 229152 INFO nova.virt.libvirt.driver [-] [instance: 60827f60-9be9-4785-89b4-96d2a13acd60] Instance spawned successfully.#033[00m
Dec  1 05:22:34 np0005540826 nova_compute[229148]: 2025-12-01 10:22:34.788 229152 DEBUG nova.virt.libvirt.driver [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 60827f60-9be9-4785-89b4-96d2a13acd60] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  1 05:22:34 np0005540826 nova_compute[229148]: 2025-12-01 10:22:34.803 229152 DEBUG nova.compute.manager [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] [instance: 60827f60-9be9-4785-89b4-96d2a13acd60] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  1 05:22:34 np0005540826 nova_compute[229148]: 2025-12-01 10:22:34.809 229152 DEBUG nova.compute.manager [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] [instance: 60827f60-9be9-4785-89b4-96d2a13acd60] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  1 05:22:34 np0005540826 nova_compute[229148]: 2025-12-01 10:22:34.812 229152 DEBUG nova.virt.libvirt.driver [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 60827f60-9be9-4785-89b4-96d2a13acd60] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  1 05:22:34 np0005540826 nova_compute[229148]: 2025-12-01 10:22:34.813 229152 DEBUG nova.virt.libvirt.driver [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 60827f60-9be9-4785-89b4-96d2a13acd60] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  1 05:22:34 np0005540826 nova_compute[229148]: 2025-12-01 10:22:34.813 229152 DEBUG nova.virt.libvirt.driver [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 60827f60-9be9-4785-89b4-96d2a13acd60] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  1 05:22:34 np0005540826 nova_compute[229148]: 2025-12-01 10:22:34.813 229152 DEBUG nova.virt.libvirt.driver [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 60827f60-9be9-4785-89b4-96d2a13acd60] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  1 05:22:34 np0005540826 nova_compute[229148]: 2025-12-01 10:22:34.814 229152 DEBUG nova.virt.libvirt.driver [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 60827f60-9be9-4785-89b4-96d2a13acd60] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  1 05:22:34 np0005540826 nova_compute[229148]: 2025-12-01 10:22:34.814 229152 DEBUG nova.virt.libvirt.driver [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 60827f60-9be9-4785-89b4-96d2a13acd60] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  1 05:22:34 np0005540826 nova_compute[229148]: 2025-12-01 10:22:34.840 229152 INFO nova.compute.manager [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] [instance: 60827f60-9be9-4785-89b4-96d2a13acd60] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  1 05:22:34 np0005540826 nova_compute[229148]: 2025-12-01 10:22:34.874 229152 INFO nova.compute.manager [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 60827f60-9be9-4785-89b4-96d2a13acd60] Took 6.95 seconds to spawn the instance on the hypervisor.#033[00m
Dec  1 05:22:34 np0005540826 nova_compute[229148]: 2025-12-01 10:22:34.875 229152 DEBUG nova.compute.manager [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 60827f60-9be9-4785-89b4-96d2a13acd60] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  1 05:22:34 np0005540826 nova_compute[229148]: 2025-12-01 10:22:34.945 229152 INFO nova.compute.manager [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 60827f60-9be9-4785-89b4-96d2a13acd60] Took 7.88 seconds to build instance.#033[00m
Dec  1 05:22:34 np0005540826 nova_compute[229148]: 2025-12-01 10:22:34.961 229152 DEBUG oslo_concurrency.lockutils [None req-63225c94-e201-4fc2-93fd-4f19649aee9d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "60827f60-9be9-4785-89b4-96d2a13acd60" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.975s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:22:35 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:22:35 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:22:35 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:22:35.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:22:36 np0005540826 nova_compute[229148]: 2025-12-01 10:22:36.537 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:22:36 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:22:36 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:22:36 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:22:36.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:22:36 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:22:37 np0005540826 nova_compute[229148]: 2025-12-01 10:22:37.002 229152 DEBUG nova.compute.manager [req-e1d7b37d-b9ab-4584-825d-4dd63684101f req-c9b5d3cf-460e-4414-8c97-bbf2b917b776 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 60827f60-9be9-4785-89b4-96d2a13acd60] Received event network-vif-plugged-a18935ec-0bdc-41b0-9e52-6e3919b1ede3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  1 05:22:37 np0005540826 nova_compute[229148]: 2025-12-01 10:22:37.002 229152 DEBUG oslo_concurrency.lockutils [req-e1d7b37d-b9ab-4584-825d-4dd63684101f req-c9b5d3cf-460e-4414-8c97-bbf2b917b776 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Acquiring lock "60827f60-9be9-4785-89b4-96d2a13acd60-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:22:37 np0005540826 nova_compute[229148]: 2025-12-01 10:22:37.003 229152 DEBUG oslo_concurrency.lockutils [req-e1d7b37d-b9ab-4584-825d-4dd63684101f req-c9b5d3cf-460e-4414-8c97-bbf2b917b776 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Lock "60827f60-9be9-4785-89b4-96d2a13acd60-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:22:37 np0005540826 nova_compute[229148]: 2025-12-01 10:22:37.003 229152 DEBUG oslo_concurrency.lockutils [req-e1d7b37d-b9ab-4584-825d-4dd63684101f req-c9b5d3cf-460e-4414-8c97-bbf2b917b776 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Lock "60827f60-9be9-4785-89b4-96d2a13acd60-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:22:37 np0005540826 nova_compute[229148]: 2025-12-01 10:22:37.003 229152 DEBUG nova.compute.manager [req-e1d7b37d-b9ab-4584-825d-4dd63684101f req-c9b5d3cf-460e-4414-8c97-bbf2b917b776 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 60827f60-9be9-4785-89b4-96d2a13acd60] No waiting events found dispatching network-vif-plugged-a18935ec-0bdc-41b0-9e52-6e3919b1ede3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  1 05:22:37 np0005540826 nova_compute[229148]: 2025-12-01 10:22:37.003 229152 WARNING nova.compute.manager [req-e1d7b37d-b9ab-4584-825d-4dd63684101f req-c9b5d3cf-460e-4414-8c97-bbf2b917b776 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 60827f60-9be9-4785-89b4-96d2a13acd60] Received unexpected event network-vif-plugged-a18935ec-0bdc-41b0-9e52-6e3919b1ede3 for instance with vm_state active and task_state None.#033[00m
Dec  1 05:22:37 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:22:37 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:22:37 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:22:37.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:22:38 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:22:38 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:22:38 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:22:38.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:22:39 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:22:39 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:22:39 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:22:39.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:22:39 np0005540826 nova_compute[229148]: 2025-12-01 10:22:39.764 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:22:39 np0005540826 NetworkManager[48989]: <info>  [1764584559.9345] manager: (patch-br-int-to-provnet-da274a4a-a49c-4f01-b728-391696cd2672): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/61)
Dec  1 05:22:39 np0005540826 nova_compute[229148]: 2025-12-01 10:22:39.933 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:22:39 np0005540826 NetworkManager[48989]: <info>  [1764584559.9365] manager: (patch-provnet-da274a4a-a49c-4f01-b728-391696cd2672-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/62)
Dec  1 05:22:39 np0005540826 ovn_controller[132309]: 2025-12-01T10:22:39Z|00090|binding|INFO|Releasing lport 45d4e66b-9979-4881-8973-52e53617afe5 from this chassis (sb_readonly=0)
Dec  1 05:22:39 np0005540826 nova_compute[229148]: 2025-12-01 10:22:39.968 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:22:39 np0005540826 ovn_controller[132309]: 2025-12-01T10:22:39Z|00091|binding|INFO|Releasing lport 45d4e66b-9979-4881-8973-52e53617afe5 from this chassis (sb_readonly=0)
Dec  1 05:22:39 np0005540826 nova_compute[229148]: 2025-12-01 10:22:39.973 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:22:40 np0005540826 nova_compute[229148]: 2025-12-01 10:22:40.252 229152 DEBUG nova.compute.manager [req-62a45692-5525-40e6-9750-f75dcc064327 req-9a44a6db-992a-4f55-8d7b-6264334eb17a dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 60827f60-9be9-4785-89b4-96d2a13acd60] Received event network-changed-a18935ec-0bdc-41b0-9e52-6e3919b1ede3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  1 05:22:40 np0005540826 nova_compute[229148]: 2025-12-01 10:22:40.253 229152 DEBUG nova.compute.manager [req-62a45692-5525-40e6-9750-f75dcc064327 req-9a44a6db-992a-4f55-8d7b-6264334eb17a dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 60827f60-9be9-4785-89b4-96d2a13acd60] Refreshing instance network info cache due to event network-changed-a18935ec-0bdc-41b0-9e52-6e3919b1ede3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  1 05:22:40 np0005540826 nova_compute[229148]: 2025-12-01 10:22:40.253 229152 DEBUG oslo_concurrency.lockutils [req-62a45692-5525-40e6-9750-f75dcc064327 req-9a44a6db-992a-4f55-8d7b-6264334eb17a dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Acquiring lock "refresh_cache-60827f60-9be9-4785-89b4-96d2a13acd60" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  1 05:22:40 np0005540826 nova_compute[229148]: 2025-12-01 10:22:40.254 229152 DEBUG oslo_concurrency.lockutils [req-62a45692-5525-40e6-9750-f75dcc064327 req-9a44a6db-992a-4f55-8d7b-6264334eb17a dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Acquired lock "refresh_cache-60827f60-9be9-4785-89b4-96d2a13acd60" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  1 05:22:40 np0005540826 nova_compute[229148]: 2025-12-01 10:22:40.254 229152 DEBUG nova.network.neutron [req-62a45692-5525-40e6-9750-f75dcc064327 req-9a44a6db-992a-4f55-8d7b-6264334eb17a dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 60827f60-9be9-4785-89b4-96d2a13acd60] Refreshing network info cache for port a18935ec-0bdc-41b0-9e52-6e3919b1ede3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  1 05:22:40 np0005540826 nova_compute[229148]: 2025-12-01 10:22:40.474 229152 DEBUG oslo_concurrency.lockutils [None req-b47f9ae4-660d-45f9-a0c0-f0d0e8d8f8a7 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Acquiring lock "60827f60-9be9-4785-89b4-96d2a13acd60" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:22:40 np0005540826 nova_compute[229148]: 2025-12-01 10:22:40.475 229152 DEBUG oslo_concurrency.lockutils [None req-b47f9ae4-660d-45f9-a0c0-f0d0e8d8f8a7 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "60827f60-9be9-4785-89b4-96d2a13acd60" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:22:40 np0005540826 nova_compute[229148]: 2025-12-01 10:22:40.475 229152 DEBUG oslo_concurrency.lockutils [None req-b47f9ae4-660d-45f9-a0c0-f0d0e8d8f8a7 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Acquiring lock "60827f60-9be9-4785-89b4-96d2a13acd60-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:22:40 np0005540826 nova_compute[229148]: 2025-12-01 10:22:40.476 229152 DEBUG oslo_concurrency.lockutils [None req-b47f9ae4-660d-45f9-a0c0-f0d0e8d8f8a7 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "60827f60-9be9-4785-89b4-96d2a13acd60-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:22:40 np0005540826 nova_compute[229148]: 2025-12-01 10:22:40.476 229152 DEBUG oslo_concurrency.lockutils [None req-b47f9ae4-660d-45f9-a0c0-f0d0e8d8f8a7 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "60827f60-9be9-4785-89b4-96d2a13acd60-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:22:40 np0005540826 nova_compute[229148]: 2025-12-01 10:22:40.478 229152 INFO nova.compute.manager [None req-b47f9ae4-660d-45f9-a0c0-f0d0e8d8f8a7 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 60827f60-9be9-4785-89b4-96d2a13acd60] Terminating instance#033[00m
Dec  1 05:22:40 np0005540826 nova_compute[229148]: 2025-12-01 10:22:40.479 229152 DEBUG nova.compute.manager [None req-b47f9ae4-660d-45f9-a0c0-f0d0e8d8f8a7 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 60827f60-9be9-4785-89b4-96d2a13acd60] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  1 05:22:40 np0005540826 kernel: tapa18935ec-0b (unregistering): left promiscuous mode
Dec  1 05:22:40 np0005540826 NetworkManager[48989]: <info>  [1764584560.5353] device (tapa18935ec-0b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  1 05:22:40 np0005540826 ovn_controller[132309]: 2025-12-01T10:22:40Z|00092|binding|INFO|Releasing lport a18935ec-0bdc-41b0-9e52-6e3919b1ede3 from this chassis (sb_readonly=0)
Dec  1 05:22:40 np0005540826 nova_compute[229148]: 2025-12-01 10:22:40.542 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:22:40 np0005540826 ovn_controller[132309]: 2025-12-01T10:22:40Z|00093|binding|INFO|Setting lport a18935ec-0bdc-41b0-9e52-6e3919b1ede3 down in Southbound
Dec  1 05:22:40 np0005540826 ovn_controller[132309]: 2025-12-01T10:22:40Z|00094|binding|INFO|Removing iface tapa18935ec-0b ovn-installed in OVS
Dec  1 05:22:40 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:22:40.551 141685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4d:c6:8f 10.100.0.7'], port_security=['fa:16:3e:4d:c6:8f 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-1647305629', 'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '60827f60-9be9-4785-89b4-96d2a13acd60', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-434ae97b-0a30-409f-b9ad-87922177cfc0', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-1647305629', 'neutron:project_id': '9f6be4e572624210b91193c011607c08', 'neutron:revision_number': '4', 'neutron:security_group_ids': '11501149-732d-4202-ad97-ece49baad0dd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.229'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3b8707eb-b5e9-4720-9f51-1840140506cb, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe07c6626d0>], logical_port=a18935ec-0bdc-41b0-9e52-6e3919b1ede3) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe07c6626d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  1 05:22:40 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:22:40.552 141685 INFO neutron.agent.ovn.metadata.agent [-] Port a18935ec-0bdc-41b0-9e52-6e3919b1ede3 in datapath 434ae97b-0a30-409f-b9ad-87922177cfc0 unbound from our chassis#033[00m
Dec  1 05:22:40 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:22:40.553 141685 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 434ae97b-0a30-409f-b9ad-87922177cfc0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  1 05:22:40 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:22:40.554 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[a484e33b-d5c9-4957-b83d-216fce60e6da]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:22:40 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:22:40.554 141685 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-434ae97b-0a30-409f-b9ad-87922177cfc0 namespace which is not needed anymore#033[00m
Dec  1 05:22:40 np0005540826 nova_compute[229148]: 2025-12-01 10:22:40.562 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:22:40 np0005540826 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000008.scope: Deactivated successfully.
Dec  1 05:22:40 np0005540826 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000008.scope: Consumed 6.137s CPU time.
Dec  1 05:22:40 np0005540826 systemd-machined[192474]: Machine qemu-5-instance-00000008 terminated.
Dec  1 05:22:40 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:22:40 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 05:22:40 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:22:40.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 05:22:40 np0005540826 neutron-haproxy-ovnmeta-434ae97b-0a30-409f-b9ad-87922177cfc0[240436]: [NOTICE]   (240440) : haproxy version is 2.8.14-c23fe91
Dec  1 05:22:40 np0005540826 neutron-haproxy-ovnmeta-434ae97b-0a30-409f-b9ad-87922177cfc0[240436]: [NOTICE]   (240440) : path to executable is /usr/sbin/haproxy
Dec  1 05:22:40 np0005540826 neutron-haproxy-ovnmeta-434ae97b-0a30-409f-b9ad-87922177cfc0[240436]: [WARNING]  (240440) : Exiting Master process...
Dec  1 05:22:40 np0005540826 neutron-haproxy-ovnmeta-434ae97b-0a30-409f-b9ad-87922177cfc0[240436]: [ALERT]    (240440) : Current worker (240442) exited with code 143 (Terminated)
Dec  1 05:22:40 np0005540826 neutron-haproxy-ovnmeta-434ae97b-0a30-409f-b9ad-87922177cfc0[240436]: [WARNING]  (240440) : All workers exited. Exiting... (0)
Dec  1 05:22:40 np0005540826 systemd[1]: libpod-5d6473296d0ab431bd07c257b3d4b38e4facced4b70de4cce14bafac47186fca.scope: Deactivated successfully.
Dec  1 05:22:40 np0005540826 nova_compute[229148]: 2025-12-01 10:22:40.719 229152 INFO nova.virt.libvirt.driver [-] [instance: 60827f60-9be9-4785-89b4-96d2a13acd60] Instance destroyed successfully.#033[00m
Dec  1 05:22:40 np0005540826 nova_compute[229148]: 2025-12-01 10:22:40.719 229152 DEBUG nova.objects.instance [None req-b47f9ae4-660d-45f9-a0c0-f0d0e8d8f8a7 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lazy-loading 'resources' on Instance uuid 60827f60-9be9-4785-89b4-96d2a13acd60 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  1 05:22:40 np0005540826 podman[240481]: 2025-12-01 10:22:40.720546201 +0000 UTC m=+0.058970537 container died 5d6473296d0ab431bd07c257b3d4b38e4facced4b70de4cce14bafac47186fca (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-434ae97b-0a30-409f-b9ad-87922177cfc0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec  1 05:22:40 np0005540826 nova_compute[229148]: 2025-12-01 10:22:40.750 229152 DEBUG nova.virt.libvirt.vif [None req-b47f9ae4-660d-45f9-a0c0-f0d0e8d8f8a7 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-01T10:22:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-265461962',display_name='tempest-TestNetworkBasicOps-server-265461962',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-265461962',id=8,image_ref='8f75d6de-6ce0-44e1-b417-d0111424475b',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLWUDl7Ca7PFjMDqatDjDzP+MrULe+8d7/8ygK8VWLub4JVkFJbBSYVM4G5JdW5K14Jsm/7uUv6P4RSF9sw4sal7fKj0kOrUNJcfmqaTM/e9p8fmKwhe/r8xlpvlLI5PaQ==',key_name='tempest-TestNetworkBasicOps-1942288070',keypairs=<?>,launch_index=0,launched_at=2025-12-01T10:22:34Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='9f6be4e572624210b91193c011607c08',ramdisk_id='',reservation_id='r-ixb0yd98',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='8f75d6de-6ce0-44e1-b417-d0111424475b',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1248115384',owner_user_name='tempest-TestNetworkBasicOps-1248115384-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-01T10:22:34Z,user_data=None,user_id='5b56a238daf0445798410e51caada0ff',uuid=60827f60-9be9-4785-89b4-96d2a13acd60,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a18935ec-0bdc-41b0-9e52-6e3919b1ede3", "address": "fa:16:3e:4d:c6:8f", "network": {"id": "434ae97b-0a30-409f-b9ad-87922177cfc0", "bridge": "br-int", "label": "tempest-network-smoke--141806258", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa18935ec-0b", "ovs_interfaceid": "a18935ec-0bdc-41b0-9e52-6e3919b1ede3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  1 05:22:40 np0005540826 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5d6473296d0ab431bd07c257b3d4b38e4facced4b70de4cce14bafac47186fca-userdata-shm.mount: Deactivated successfully.
Dec  1 05:22:40 np0005540826 systemd[1]: var-lib-containers-storage-overlay-fa4dfc3534ccabeed5ae0df17b78c24e33c2e5ecd559ef65e7adf15020a6b8d6-merged.mount: Deactivated successfully.
Dec  1 05:22:40 np0005540826 nova_compute[229148]: 2025-12-01 10:22:40.757 229152 DEBUG nova.network.os_vif_util [None req-b47f9ae4-660d-45f9-a0c0-f0d0e8d8f8a7 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Converting VIF {"id": "a18935ec-0bdc-41b0-9e52-6e3919b1ede3", "address": "fa:16:3e:4d:c6:8f", "network": {"id": "434ae97b-0a30-409f-b9ad-87922177cfc0", "bridge": "br-int", "label": "tempest-network-smoke--141806258", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa18935ec-0b", "ovs_interfaceid": "a18935ec-0bdc-41b0-9e52-6e3919b1ede3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  1 05:22:40 np0005540826 nova_compute[229148]: 2025-12-01 10:22:40.758 229152 DEBUG nova.network.os_vif_util [None req-b47f9ae4-660d-45f9-a0c0-f0d0e8d8f8a7 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4d:c6:8f,bridge_name='br-int',has_traffic_filtering=True,id=a18935ec-0bdc-41b0-9e52-6e3919b1ede3,network=Network(434ae97b-0a30-409f-b9ad-87922177cfc0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapa18935ec-0b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  1 05:22:40 np0005540826 nova_compute[229148]: 2025-12-01 10:22:40.759 229152 DEBUG os_vif [None req-b47f9ae4-660d-45f9-a0c0-f0d0e8d8f8a7 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4d:c6:8f,bridge_name='br-int',has_traffic_filtering=True,id=a18935ec-0bdc-41b0-9e52-6e3919b1ede3,network=Network(434ae97b-0a30-409f-b9ad-87922177cfc0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapa18935ec-0b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  1 05:22:40 np0005540826 nova_compute[229148]: 2025-12-01 10:22:40.762 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:22:40 np0005540826 nova_compute[229148]: 2025-12-01 10:22:40.763 229152 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa18935ec-0b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  1 05:22:40 np0005540826 nova_compute[229148]: 2025-12-01 10:22:40.769 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  1 05:22:40 np0005540826 nova_compute[229148]: 2025-12-01 10:22:40.772 229152 INFO os_vif [None req-b47f9ae4-660d-45f9-a0c0-f0d0e8d8f8a7 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4d:c6:8f,bridge_name='br-int',has_traffic_filtering=True,id=a18935ec-0bdc-41b0-9e52-6e3919b1ede3,network=Network(434ae97b-0a30-409f-b9ad-87922177cfc0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapa18935ec-0b')#033[00m
Dec  1 05:22:40 np0005540826 podman[240481]: 2025-12-01 10:22:40.774213548 +0000 UTC m=+0.112637884 container cleanup 5d6473296d0ab431bd07c257b3d4b38e4facced4b70de4cce14bafac47186fca (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-434ae97b-0a30-409f-b9ad-87922177cfc0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec  1 05:22:40 np0005540826 systemd[1]: libpod-conmon-5d6473296d0ab431bd07c257b3d4b38e4facced4b70de4cce14bafac47186fca.scope: Deactivated successfully.
Dec  1 05:22:40 np0005540826 nova_compute[229148]: 2025-12-01 10:22:40.825 229152 DEBUG nova.compute.manager [req-90f77456-a446-4ace-bd75-03e833fe4dbb req-c5e4d0b7-2692-4129-baa2-ca6f05b2409a dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 60827f60-9be9-4785-89b4-96d2a13acd60] Received event network-vif-unplugged-a18935ec-0bdc-41b0-9e52-6e3919b1ede3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  1 05:22:40 np0005540826 nova_compute[229148]: 2025-12-01 10:22:40.825 229152 DEBUG oslo_concurrency.lockutils [req-90f77456-a446-4ace-bd75-03e833fe4dbb req-c5e4d0b7-2692-4129-baa2-ca6f05b2409a dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Acquiring lock "60827f60-9be9-4785-89b4-96d2a13acd60-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:22:40 np0005540826 nova_compute[229148]: 2025-12-01 10:22:40.825 229152 DEBUG oslo_concurrency.lockutils [req-90f77456-a446-4ace-bd75-03e833fe4dbb req-c5e4d0b7-2692-4129-baa2-ca6f05b2409a dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Lock "60827f60-9be9-4785-89b4-96d2a13acd60-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:22:40 np0005540826 nova_compute[229148]: 2025-12-01 10:22:40.826 229152 DEBUG oslo_concurrency.lockutils [req-90f77456-a446-4ace-bd75-03e833fe4dbb req-c5e4d0b7-2692-4129-baa2-ca6f05b2409a dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Lock "60827f60-9be9-4785-89b4-96d2a13acd60-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:22:40 np0005540826 nova_compute[229148]: 2025-12-01 10:22:40.826 229152 DEBUG nova.compute.manager [req-90f77456-a446-4ace-bd75-03e833fe4dbb req-c5e4d0b7-2692-4129-baa2-ca6f05b2409a dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 60827f60-9be9-4785-89b4-96d2a13acd60] No waiting events found dispatching network-vif-unplugged-a18935ec-0bdc-41b0-9e52-6e3919b1ede3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  1 05:22:40 np0005540826 nova_compute[229148]: 2025-12-01 10:22:40.827 229152 DEBUG nova.compute.manager [req-90f77456-a446-4ace-bd75-03e833fe4dbb req-c5e4d0b7-2692-4129-baa2-ca6f05b2409a dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 60827f60-9be9-4785-89b4-96d2a13acd60] Received event network-vif-unplugged-a18935ec-0bdc-41b0-9e52-6e3919b1ede3 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec  1 05:22:40 np0005540826 podman[240526]: 2025-12-01 10:22:40.847228764 +0000 UTC m=+0.046193030 container remove 5d6473296d0ab431bd07c257b3d4b38e4facced4b70de4cce14bafac47186fca (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-434ae97b-0a30-409f-b9ad-87922177cfc0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Dec  1 05:22:40 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:22:40.853 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[49bb0d4c-690e-4832-97c6-3ee83d6fb487]: (4, ('Mon Dec  1 10:22:40 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-434ae97b-0a30-409f-b9ad-87922177cfc0 (5d6473296d0ab431bd07c257b3d4b38e4facced4b70de4cce14bafac47186fca)\n5d6473296d0ab431bd07c257b3d4b38e4facced4b70de4cce14bafac47186fca\nMon Dec  1 10:22:40 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-434ae97b-0a30-409f-b9ad-87922177cfc0 (5d6473296d0ab431bd07c257b3d4b38e4facced4b70de4cce14bafac47186fca)\n5d6473296d0ab431bd07c257b3d4b38e4facced4b70de4cce14bafac47186fca\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:22:40 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:22:40.855 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[d05b9d86-ea5e-4f27-8969-539294b60744]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:22:40 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:22:40.856 141685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap434ae97b-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  1 05:22:40 np0005540826 kernel: tap434ae97b-00: left promiscuous mode
Dec  1 05:22:40 np0005540826 nova_compute[229148]: 2025-12-01 10:22:40.901 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:22:40 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:22:40.903 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[097ac417-9b7e-417e-a90f-2ea7209480a2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:22:40 np0005540826 nova_compute[229148]: 2025-12-01 10:22:40.916 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:22:40 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:22:40.921 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[9d80a626-d729-4e9c-8603-f499be84423e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:22:40 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:22:40.922 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[c45fe4f3-c82b-42d1-9f18-7d2435bc9235]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:22:40 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:22:40.937 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[ae4f03b5-f97e-4a57-ac07-ec9d90443b98]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 441534, 'reachable_time': 20689, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 240550, 'error': None, 'target': 'ovnmeta-434ae97b-0a30-409f-b9ad-87922177cfc0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:22:40 np0005540826 systemd[1]: run-netns-ovnmeta\x2d434ae97b\x2d0a30\x2d409f\x2db9ad\x2d87922177cfc0.mount: Deactivated successfully.
Dec  1 05:22:40 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:22:40.940 141797 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-434ae97b-0a30-409f-b9ad-87922177cfc0 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec  1 05:22:40 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:22:40.940 141797 DEBUG oslo.privsep.daemon [-] privsep: reply[15af7a66-eaa1-44ba-a6bc-2a1223222742]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:22:41 np0005540826 nova_compute[229148]: 2025-12-01 10:22:41.168 229152 INFO nova.virt.libvirt.driver [None req-b47f9ae4-660d-45f9-a0c0-f0d0e8d8f8a7 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 60827f60-9be9-4785-89b4-96d2a13acd60] Deleting instance files /var/lib/nova/instances/60827f60-9be9-4785-89b4-96d2a13acd60_del#033[00m
Dec  1 05:22:41 np0005540826 nova_compute[229148]: 2025-12-01 10:22:41.169 229152 INFO nova.virt.libvirt.driver [None req-b47f9ae4-660d-45f9-a0c0-f0d0e8d8f8a7 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 60827f60-9be9-4785-89b4-96d2a13acd60] Deletion of /var/lib/nova/instances/60827f60-9be9-4785-89b4-96d2a13acd60_del complete#033[00m
Dec  1 05:22:41 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:22:41 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:22:41 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:22:41.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:22:41 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:22:41 np0005540826 nova_compute[229148]: 2025-12-01 10:22:41.792 229152 INFO nova.compute.manager [None req-b47f9ae4-660d-45f9-a0c0-f0d0e8d8f8a7 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 60827f60-9be9-4785-89b4-96d2a13acd60] Took 1.31 seconds to destroy the instance on the hypervisor.#033[00m
Dec  1 05:22:41 np0005540826 nova_compute[229148]: 2025-12-01 10:22:41.794 229152 DEBUG oslo.service.loopingcall [None req-b47f9ae4-660d-45f9-a0c0-f0d0e8d8f8a7 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  1 05:22:41 np0005540826 nova_compute[229148]: 2025-12-01 10:22:41.795 229152 DEBUG nova.compute.manager [-] [instance: 60827f60-9be9-4785-89b4-96d2a13acd60] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  1 05:22:41 np0005540826 nova_compute[229148]: 2025-12-01 10:22:41.796 229152 DEBUG nova.network.neutron [-] [instance: 60827f60-9be9-4785-89b4-96d2a13acd60] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  1 05:22:42 np0005540826 nova_compute[229148]: 2025-12-01 10:22:42.161 229152 DEBUG nova.network.neutron [req-62a45692-5525-40e6-9750-f75dcc064327 req-9a44a6db-992a-4f55-8d7b-6264334eb17a dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 60827f60-9be9-4785-89b4-96d2a13acd60] Updated VIF entry in instance network info cache for port a18935ec-0bdc-41b0-9e52-6e3919b1ede3. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  1 05:22:42 np0005540826 nova_compute[229148]: 2025-12-01 10:22:42.162 229152 DEBUG nova.network.neutron [req-62a45692-5525-40e6-9750-f75dcc064327 req-9a44a6db-992a-4f55-8d7b-6264334eb17a dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 60827f60-9be9-4785-89b4-96d2a13acd60] Updating instance_info_cache with network_info: [{"id": "a18935ec-0bdc-41b0-9e52-6e3919b1ede3", "address": "fa:16:3e:4d:c6:8f", "network": {"id": "434ae97b-0a30-409f-b9ad-87922177cfc0", "bridge": "br-int", "label": "tempest-network-smoke--141806258", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa18935ec-0b", "ovs_interfaceid": "a18935ec-0bdc-41b0-9e52-6e3919b1ede3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  1 05:22:42 np0005540826 nova_compute[229148]: 2025-12-01 10:22:42.184 229152 DEBUG oslo_concurrency.lockutils [req-62a45692-5525-40e6-9750-f75dcc064327 req-9a44a6db-992a-4f55-8d7b-6264334eb17a dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Releasing lock "refresh_cache-60827f60-9be9-4785-89b4-96d2a13acd60" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  1 05:22:42 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:22:42 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 05:22:42 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:22:42.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 05:22:42 np0005540826 nova_compute[229148]: 2025-12-01 10:22:42.907 229152 DEBUG nova.compute.manager [req-2fe0ee76-44fa-454b-9e52-29449e795cc3 req-c7384925-3885-4623-a412-bf3d8928c382 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 60827f60-9be9-4785-89b4-96d2a13acd60] Received event network-vif-plugged-a18935ec-0bdc-41b0-9e52-6e3919b1ede3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  1 05:22:42 np0005540826 nova_compute[229148]: 2025-12-01 10:22:42.908 229152 DEBUG oslo_concurrency.lockutils [req-2fe0ee76-44fa-454b-9e52-29449e795cc3 req-c7384925-3885-4623-a412-bf3d8928c382 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Acquiring lock "60827f60-9be9-4785-89b4-96d2a13acd60-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:22:42 np0005540826 nova_compute[229148]: 2025-12-01 10:22:42.909 229152 DEBUG oslo_concurrency.lockutils [req-2fe0ee76-44fa-454b-9e52-29449e795cc3 req-c7384925-3885-4623-a412-bf3d8928c382 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Lock "60827f60-9be9-4785-89b4-96d2a13acd60-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:22:42 np0005540826 nova_compute[229148]: 2025-12-01 10:22:42.909 229152 DEBUG oslo_concurrency.lockutils [req-2fe0ee76-44fa-454b-9e52-29449e795cc3 req-c7384925-3885-4623-a412-bf3d8928c382 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Lock "60827f60-9be9-4785-89b4-96d2a13acd60-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:22:42 np0005540826 nova_compute[229148]: 2025-12-01 10:22:42.909 229152 DEBUG nova.compute.manager [req-2fe0ee76-44fa-454b-9e52-29449e795cc3 req-c7384925-3885-4623-a412-bf3d8928c382 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 60827f60-9be9-4785-89b4-96d2a13acd60] No waiting events found dispatching network-vif-plugged-a18935ec-0bdc-41b0-9e52-6e3919b1ede3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  1 05:22:42 np0005540826 nova_compute[229148]: 2025-12-01 10:22:42.910 229152 WARNING nova.compute.manager [req-2fe0ee76-44fa-454b-9e52-29449e795cc3 req-c7384925-3885-4623-a412-bf3d8928c382 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 60827f60-9be9-4785-89b4-96d2a13acd60] Received unexpected event network-vif-plugged-a18935ec-0bdc-41b0-9e52-6e3919b1ede3 for instance with vm_state active and task_state deleting.#033[00m
Dec  1 05:22:43 np0005540826 nova_compute[229148]: 2025-12-01 10:22:43.147 229152 DEBUG nova.network.neutron [-] [instance: 60827f60-9be9-4785-89b4-96d2a13acd60] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  1 05:22:43 np0005540826 nova_compute[229148]: 2025-12-01 10:22:43.162 229152 INFO nova.compute.manager [-] [instance: 60827f60-9be9-4785-89b4-96d2a13acd60] Took 1.37 seconds to deallocate network for instance.#033[00m
Dec  1 05:22:43 np0005540826 nova_compute[229148]: 2025-12-01 10:22:43.219 229152 DEBUG oslo_concurrency.lockutils [None req-b47f9ae4-660d-45f9-a0c0-f0d0e8d8f8a7 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:22:43 np0005540826 nova_compute[229148]: 2025-12-01 10:22:43.220 229152 DEBUG oslo_concurrency.lockutils [None req-b47f9ae4-660d-45f9-a0c0-f0d0e8d8f8a7 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:22:43 np0005540826 nova_compute[229148]: 2025-12-01 10:22:43.261 229152 DEBUG oslo_concurrency.processutils [None req-b47f9ae4-660d-45f9-a0c0-f0d0e8d8f8a7 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:22:43 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:22:43 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:22:43 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:22:43.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:22:43 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:22:43 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1338427231' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:22:43 np0005540826 nova_compute[229148]: 2025-12-01 10:22:43.894 229152 DEBUG oslo_concurrency.processutils [None req-b47f9ae4-660d-45f9-a0c0-f0d0e8d8f8a7 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.633s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:22:43 np0005540826 nova_compute[229148]: 2025-12-01 10:22:43.903 229152 DEBUG nova.compute.provider_tree [None req-b47f9ae4-660d-45f9-a0c0-f0d0e8d8f8a7 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Inventory has not changed in ProviderTree for provider: 19014d04-db84-4f3d-831b-084720e9168c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  1 05:22:43 np0005540826 nova_compute[229148]: 2025-12-01 10:22:43.923 229152 DEBUG nova.scheduler.client.report [None req-b47f9ae4-660d-45f9-a0c0-f0d0e8d8f8a7 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Inventory has not changed for provider 19014d04-db84-4f3d-831b-084720e9168c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  1 05:22:43 np0005540826 nova_compute[229148]: 2025-12-01 10:22:43.948 229152 DEBUG oslo_concurrency.lockutils [None req-b47f9ae4-660d-45f9-a0c0-f0d0e8d8f8a7 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.728s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:22:43 np0005540826 nova_compute[229148]: 2025-12-01 10:22:43.973 229152 INFO nova.scheduler.client.report [None req-b47f9ae4-660d-45f9-a0c0-f0d0e8d8f8a7 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Deleted allocations for instance 60827f60-9be9-4785-89b4-96d2a13acd60#033[00m
Dec  1 05:22:44 np0005540826 nova_compute[229148]: 2025-12-01 10:22:44.058 229152 DEBUG oslo_concurrency.lockutils [None req-b47f9ae4-660d-45f9-a0c0-f0d0e8d8f8a7 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "60827f60-9be9-4785-89b4-96d2a13acd60" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.583s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:22:44 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:22:44 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:22:44 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:22:44.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:22:44 np0005540826 nova_compute[229148]: 2025-12-01 10:22:44.768 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:22:45 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:22:45 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:22:45 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:22:45.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:22:45 np0005540826 nova_compute[229148]: 2025-12-01 10:22:45.767 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:22:46 np0005540826 podman[240625]: 2025-12-01 10:22:46.186291865 +0000 UTC m=+0.076359986 container health_status 9ce59c2758d021c154e97f8a3178b73d8f43045c59b9ba6fd93369a760a49136 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, config_id=multipathd)
Dec  1 05:22:46 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:22:46 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 05:22:46 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:22:46.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 05:22:46 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:22:47 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:22:47 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:22:47 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:22:47.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:22:48 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:22:48 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:22:48 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:22:48 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:22:48 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:22:48 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:22:48 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 05:22:48 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:22:48 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:22:48 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 05:22:48 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:22:48 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:22:48 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:22:48.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:22:49 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:22:49 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:22:49 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:22:49.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:22:49 np0005540826 nova_compute[229148]: 2025-12-01 10:22:49.769 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:22:50 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:22:50 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:22:50 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:22:50.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:22:50 np0005540826 nova_compute[229148]: 2025-12-01 10:22:50.780 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:22:51 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:22:51 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:22:51 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:22:51.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:22:51 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:22:52 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:22:52 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:22:52 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:22:52.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:22:53 np0005540826 nova_compute[229148]: 2025-12-01 10:22:53.110 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:22:53 np0005540826 nova_compute[229148]: 2025-12-01 10:22:53.111 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  1 05:22:53 np0005540826 nova_compute[229148]: 2025-12-01 10:22:53.111 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  1 05:22:53 np0005540826 nova_compute[229148]: 2025-12-01 10:22:53.140 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  1 05:22:53 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:22:53 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 05:22:53 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:22:53.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 05:22:53 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:22:53 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:22:54 np0005540826 nova_compute[229148]: 2025-12-01 10:22:54.110 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:22:54 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:22:54 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:22:54 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:22:54.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:22:54 np0005540826 nova_compute[229148]: 2025-12-01 10:22:54.771 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:22:55 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:22:55 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:22:55 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:22:55.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:22:55 np0005540826 nova_compute[229148]: 2025-12-01 10:22:55.718 229152 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764584560.7163484, 60827f60-9be9-4785-89b4-96d2a13acd60 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  1 05:22:55 np0005540826 nova_compute[229148]: 2025-12-01 10:22:55.718 229152 INFO nova.compute.manager [-] [instance: 60827f60-9be9-4785-89b4-96d2a13acd60] VM Stopped (Lifecycle Event)#033[00m
Dec  1 05:22:55 np0005540826 nova_compute[229148]: 2025-12-01 10:22:55.747 229152 DEBUG nova.compute.manager [None req-23396ace-90a3-46d2-9688-a5081990f831 - - - - - -] [instance: 60827f60-9be9-4785-89b4-96d2a13acd60] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  1 05:22:55 np0005540826 nova_compute[229148]: 2025-12-01 10:22:55.783 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:22:56 np0005540826 podman[240800]: 2025-12-01 10:22:56.023120778 +0000 UTC m=+0.101901570 container health_status 6b06bbb7dde72e1297fe084ecc5611549b12415c8a2a7d771b9728c6924dbf55 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible)
Dec  1 05:22:56 np0005540826 nova_compute[229148]: 2025-12-01 10:22:56.110 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:22:56 np0005540826 nova_compute[229148]: 2025-12-01 10:22:56.111 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:22:56 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:22:56 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:22:56 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:22:56.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:22:56 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:22:57 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:22:57 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:22:57 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:22:57.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:22:58 np0005540826 nova_compute[229148]: 2025-12-01 10:22:58.109 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:22:58 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:22:58 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:22:58 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:22:58.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:22:59 np0005540826 nova_compute[229148]: 2025-12-01 10:22:59.109 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:22:59 np0005540826 nova_compute[229148]: 2025-12-01 10:22:59.110 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:22:59 np0005540826 nova_compute[229148]: 2025-12-01 10:22:59.110 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  1 05:22:59 np0005540826 nova_compute[229148]: 2025-12-01 10:22:59.110 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:22:59 np0005540826 nova_compute[229148]: 2025-12-01 10:22:59.146 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:22:59 np0005540826 nova_compute[229148]: 2025-12-01 10:22:59.146 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:22:59 np0005540826 nova_compute[229148]: 2025-12-01 10:22:59.146 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:22:59 np0005540826 nova_compute[229148]: 2025-12-01 10:22:59.146 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  1 05:22:59 np0005540826 nova_compute[229148]: 2025-12-01 10:22:59.147 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:22:59 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:22:59.282 141685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '36:10:da', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '4e:5c:35:98:90:37'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  1 05:22:59 np0005540826 nova_compute[229148]: 2025-12-01 10:22:59.282 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:22:59 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:22:59.284 141685 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  1 05:22:59 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:22:59 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:22:59 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:22:59.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:22:59 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:22:59 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1200241194' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:22:59 np0005540826 nova_compute[229148]: 2025-12-01 10:22:59.641 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.494s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:22:59 np0005540826 podman[240851]: 2025-12-01 10:22:59.759367416 +0000 UTC m=+0.062993433 container health_status 2f6a34a2eb1139886d5df897b4264e2aa17883bd121a8757ac91426f6d44356f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec  1 05:22:59 np0005540826 nova_compute[229148]: 2025-12-01 10:22:59.773 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:22:59 np0005540826 nova_compute[229148]: 2025-12-01 10:22:59.845 229152 WARNING nova.virt.libvirt.driver [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  1 05:22:59 np0005540826 nova_compute[229148]: 2025-12-01 10:22:59.846 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4886MB free_disk=59.967525482177734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  1 05:22:59 np0005540826 nova_compute[229148]: 2025-12-01 10:22:59.847 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:22:59 np0005540826 nova_compute[229148]: 2025-12-01 10:22:59.847 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:22:59 np0005540826 nova_compute[229148]: 2025-12-01 10:22:59.924 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  1 05:22:59 np0005540826 nova_compute[229148]: 2025-12-01 10:22:59.924 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  1 05:22:59 np0005540826 nova_compute[229148]: 2025-12-01 10:22:59.941 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:23:00 np0005540826 ceph-mon[80026]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #58. Immutable memtables: 0.
Dec  1 05:23:00 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:23:00.258105) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  1 05:23:00 np0005540826 ceph-mon[80026]: rocksdb: [db/flush_job.cc:856] [default] [JOB 33] Flushing memtable with next log file: 58
Dec  1 05:23:00 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584580259127, "job": 33, "event": "flush_started", "num_memtables": 1, "num_entries": 1634, "num_deletes": 502, "total_data_size": 3416593, "memory_usage": 3472760, "flush_reason": "Manual Compaction"}
Dec  1 05:23:00 np0005540826 ceph-mon[80026]: rocksdb: [db/flush_job.cc:885] [default] [JOB 33] Level-0 flush table #59: started
Dec  1 05:23:00 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584580275839, "cf_name": "default", "job": 33, "event": "table_file_creation", "file_number": 59, "file_size": 2094203, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 29530, "largest_seqno": 31159, "table_properties": {"data_size": 2087844, "index_size": 3049, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2245, "raw_key_size": 17625, "raw_average_key_size": 19, "raw_value_size": 2072866, "raw_average_value_size": 2334, "num_data_blocks": 132, "num_entries": 888, "num_filter_entries": 888, "num_deletions": 502, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764584476, "oldest_key_time": 1764584476, "file_creation_time": 1764584580, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d6e1f97e-eb58-41c1-b758-cb672eabd75e", "db_session_id": "KSHNHU57VR1LHZ6EZUM4", "orig_file_number": 59, "seqno_to_time_mapping": "N/A"}}
Dec  1 05:23:00 np0005540826 ceph-mon[80026]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 33] Flush lasted 16881 microseconds, and 5800 cpu microseconds.
Dec  1 05:23:00 np0005540826 ceph-mon[80026]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 05:23:00 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:23:00.275916) [db/flush_job.cc:967] [default] [JOB 33] Level-0 flush table #59: 2094203 bytes OK
Dec  1 05:23:00 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:23:00.275943) [db/memtable_list.cc:519] [default] Level-0 commit table #59 started
Dec  1 05:23:00 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:23:00.277542) [db/memtable_list.cc:722] [default] Level-0 commit table #59: memtable #1 done
Dec  1 05:23:00 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:23:00.277557) EVENT_LOG_v1 {"time_micros": 1764584580277553, "job": 33, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  1 05:23:00 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:23:00.277579) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  1 05:23:00 np0005540826 ceph-mon[80026]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 33] Try to delete WAL files size 3408037, prev total WAL file size 3408037, number of live WAL files 2.
Dec  1 05:23:00 np0005540826 ceph-mon[80026]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000055.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:23:00 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:23:00.278756) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032323539' seq:72057594037927935, type:22 .. '7061786F730032353131' seq:0, type:0; will stop at (end)
Dec  1 05:23:00 np0005540826 ceph-mon[80026]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 34] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  1 05:23:00 np0005540826 ceph-mon[80026]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 33 Base level 0, inputs: [59(2045KB)], [57(16MB)]
Dec  1 05:23:00 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584580278855, "job": 34, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [59], "files_L6": [57], "score": -1, "input_data_size": 18940179, "oldest_snapshot_seqno": -1}
Dec  1 05:23:00 np0005540826 ceph-mon[80026]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 34] Generated table #60: 5817 keys, 12756732 bytes, temperature: kUnknown
Dec  1 05:23:00 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584580340243, "cf_name": "default", "job": 34, "event": "table_file_creation", "file_number": 60, "file_size": 12756732, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12719622, "index_size": 21448, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 14597, "raw_key_size": 150321, "raw_average_key_size": 25, "raw_value_size": 12616249, "raw_average_value_size": 2168, "num_data_blocks": 859, "num_entries": 5817, "num_filter_entries": 5817, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582559, "oldest_key_time": 0, "file_creation_time": 1764584580, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d6e1f97e-eb58-41c1-b758-cb672eabd75e", "db_session_id": "KSHNHU57VR1LHZ6EZUM4", "orig_file_number": 60, "seqno_to_time_mapping": "N/A"}}
Dec  1 05:23:00 np0005540826 ceph-mon[80026]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 05:23:00 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:23:00.340563) [db/compaction/compaction_job.cc:1663] [default] [JOB 34] Compacted 1@0 + 1@6 files to L6 => 12756732 bytes
Dec  1 05:23:00 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:23:00.343828) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 308.1 rd, 207.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.0, 16.1 +0.0 blob) out(12.2 +0.0 blob), read-write-amplify(15.1) write-amplify(6.1) OK, records in: 6833, records dropped: 1016 output_compression: NoCompression
Dec  1 05:23:00 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:23:00.343853) EVENT_LOG_v1 {"time_micros": 1764584580343840, "job": 34, "event": "compaction_finished", "compaction_time_micros": 61474, "compaction_time_cpu_micros": 29350, "output_level": 6, "num_output_files": 1, "total_output_size": 12756732, "num_input_records": 6833, "num_output_records": 5817, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  1 05:23:00 np0005540826 ceph-mon[80026]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000059.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:23:00 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584580344342, "job": 34, "event": "table_file_deletion", "file_number": 59}
Dec  1 05:23:00 np0005540826 ceph-mon[80026]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000057.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:23:00 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584580346999, "job": 34, "event": "table_file_deletion", "file_number": 57}
Dec  1 05:23:00 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:23:00.278599) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:23:00 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:23:00.347105) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:23:00 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:23:00.347112) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:23:00 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:23:00.347114) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:23:00 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:23:00.347115) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:23:00 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:23:00.347117) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:23:00 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:23:00 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1107339476' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:23:00 np0005540826 nova_compute[229148]: 2025-12-01 10:23:00.403 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:23:00 np0005540826 nova_compute[229148]: 2025-12-01 10:23:00.409 229152 DEBUG nova.compute.provider_tree [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Inventory has not changed in ProviderTree for provider: 19014d04-db84-4f3d-831b-084720e9168c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  1 05:23:00 np0005540826 nova_compute[229148]: 2025-12-01 10:23:00.472 229152 DEBUG nova.scheduler.client.report [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Inventory has not changed for provider 19014d04-db84-4f3d-831b-084720e9168c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  1 05:23:00 np0005540826 nova_compute[229148]: 2025-12-01 10:23:00.498 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  1 05:23:00 np0005540826 nova_compute[229148]: 2025-12-01 10:23:00.498 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.651s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:23:00 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:23:00 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:23:00 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:23:00.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:23:00 np0005540826 nova_compute[229148]: 2025-12-01 10:23:00.786 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:23:01 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:23:01 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:23:01 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:23:01.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:23:01 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:23:02 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:23:02 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:23:02 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:23:02.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:23:03 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:23:03.286 141685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=b99910e3-15ec-4cc7-b887-f5229f22d165, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  1 05:23:03 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:23:03 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:23:03 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:23:03.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:23:03 np0005540826 nova_compute[229148]: 2025-12-01 10:23:03.494 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:23:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:23:04.554 141685 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:23:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:23:04.555 141685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:23:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:23:04.555 141685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:23:04 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:23:04 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:23:04 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:23:04.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:23:04 np0005540826 nova_compute[229148]: 2025-12-01 10:23:04.815 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:23:05 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:23:05 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:23:05 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:23:05.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:23:05 np0005540826 nova_compute[229148]: 2025-12-01 10:23:05.789 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:23:06 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:23:06 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:23:06 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:23:06.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:23:06 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:23:07 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:23:07 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:23:07 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:23:07.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:23:08 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:23:08 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:23:08 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:23:08.669 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:23:09 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:23:09 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:23:09 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:23:09.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:23:09 np0005540826 nova_compute[229148]: 2025-12-01 10:23:09.817 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:23:10 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:23:10 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:23:10 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:23:10.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:23:10 np0005540826 nova_compute[229148]: 2025-12-01 10:23:10.848 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:23:11 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:23:11 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:23:11 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:23:11.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:23:11 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:23:12 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:23:12 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:23:12 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:23:12.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:23:13 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:23:13 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:23:13 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:23:13.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:23:14 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:23:14 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:23:14 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:23:14.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:23:14 np0005540826 nova_compute[229148]: 2025-12-01 10:23:14.819 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:23:15 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:23:15 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:23:15 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:23:15.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:23:15 np0005540826 nova_compute[229148]: 2025-12-01 10:23:15.851 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:23:16 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:23:16 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:23:16 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:23:16 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:23:16.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:23:16 np0005540826 nova_compute[229148]: 2025-12-01 10:23:16.706 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:23:16 np0005540826 nova_compute[229148]: 2025-12-01 10:23:16.779 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:23:16 np0005540826 podman[240930]: 2025-12-01 10:23:16.978344513 +0000 UTC m=+0.059941862 container health_status 9ce59c2758d021c154e97f8a3178b73d8f43045c59b9ba6fd93369a760a49136 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec  1 05:23:17 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:23:17 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:23:17 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:23:17.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:23:18 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:23:18 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:23:18 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:23:18.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:23:19 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:23:19 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:23:19 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:23:19.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:23:19 np0005540826 nova_compute[229148]: 2025-12-01 10:23:19.821 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:23:20 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:23:20 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:23:20 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:23:20.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:23:20 np0005540826 nova_compute[229148]: 2025-12-01 10:23:20.854 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:23:21 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:23:21 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:23:21 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:23:21.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:23:21 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:23:22 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:23:22 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:23:22 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:23:22.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:23:23 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:23:23 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:23:23 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:23:23.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:23:24 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:23:24 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:23:24 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:23:24.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:23:24 np0005540826 nova_compute[229148]: 2025-12-01 10:23:24.823 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:23:25 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:23:25 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:23:25 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:23:25.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:23:25 np0005540826 nova_compute[229148]: 2025-12-01 10:23:25.857 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:23:26 np0005540826 podman[240978]: 2025-12-01 10:23:26.207019109 +0000 UTC m=+0.107686843 container health_status 6b06bbb7dde72e1297fe084ecc5611549b12415c8a2a7d771b9728c6924dbf55 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec  1 05:23:26 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:23:26 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:23:26 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:23:26 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:23:26.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:23:27 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:23:27 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:23:27 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:23:27.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:23:28 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:23:28 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:23:28 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:23:28.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:23:29 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:23:29 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:23:29 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:23:29.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:23:29 np0005540826 nova_compute[229148]: 2025-12-01 10:23:29.825 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:23:29 np0005540826 podman[241007]: 2025-12-01 10:23:29.970227702 +0000 UTC m=+0.056221101 container health_status 2f6a34a2eb1139886d5df897b4264e2aa17883bd121a8757ac91426f6d44356f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Dec  1 05:23:30 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:23:30 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:23:30 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:23:30.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:23:30 np0005540826 nova_compute[229148]: 2025-12-01 10:23:30.859 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:23:31 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:23:31 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:23:31 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:23:31.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:23:31 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:23:32 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:23:32 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:23:32 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:23:32.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:23:33 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:23:33 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:23:33 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:23:33.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:23:34 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:23:34 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:23:34 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:23:34.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:23:34 np0005540826 nova_compute[229148]: 2025-12-01 10:23:34.827 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:23:35 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:23:35 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:23:35 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:23:35.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:23:35 np0005540826 nova_compute[229148]: 2025-12-01 10:23:35.862 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:23:36 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:23:36 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:23:36 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:23:36 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:23:36.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:23:37 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:23:37 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:23:37 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:23:37.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:23:38 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:23:38 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:23:38 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:23:38.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:23:39 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:23:39 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:23:39 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:23:39.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:23:39 np0005540826 nova_compute[229148]: 2025-12-01 10:23:39.828 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:23:40 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:23:40 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:23:40 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:23:40.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:23:40 np0005540826 nova_compute[229148]: 2025-12-01 10:23:40.865 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:23:41 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:23:41 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:23:41 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:23:41.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:23:41 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:23:42 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:23:42 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:23:42 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:23:42.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:23:43 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:23:43 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:23:43 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:23:43.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:23:44 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:23:44 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:23:44 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:23:44.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:23:44 np0005540826 nova_compute[229148]: 2025-12-01 10:23:44.831 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:23:45 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:23:45 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:23:45 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:23:45.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:23:45 np0005540826 nova_compute[229148]: 2025-12-01 10:23:45.868 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:23:46 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:23:46 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:23:46 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:23:46 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:23:46.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:23:47 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:23:47 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:23:47 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:23:47.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:23:47 np0005540826 podman[241061]: 2025-12-01 10:23:47.96616709 +0000 UTC m=+0.050829199 container health_status 9ce59c2758d021c154e97f8a3178b73d8f43045c59b9ba6fd93369a760a49136 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec  1 05:23:48 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:23:48 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:23:48 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:23:48.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:23:49 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:23:49 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:23:49 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:23:49.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:23:49 np0005540826 nova_compute[229148]: 2025-12-01 10:23:49.832 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:23:50 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:23:50 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:23:50 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:23:50.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:23:50 np0005540826 nova_compute[229148]: 2025-12-01 10:23:50.871 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:23:51 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:23:51 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:23:51 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:23:51.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:23:51 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:23:51 np0005540826 ovn_controller[132309]: 2025-12-01T10:23:51Z|00095|memory_trim|INFO|Detected inactivity (last active 30005 ms ago): trimming memory
Dec  1 05:23:52 np0005540826 nova_compute[229148]: 2025-12-01 10:23:52.105 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:23:52 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:23:52 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:23:52 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:23:52.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:23:53 np0005540826 nova_compute[229148]: 2025-12-01 10:23:53.110 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:23:53 np0005540826 nova_compute[229148]: 2025-12-01 10:23:53.110 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  1 05:23:53 np0005540826 nova_compute[229148]: 2025-12-01 10:23:53.110 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  1 05:23:53 np0005540826 nova_compute[229148]: 2025-12-01 10:23:53.129 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  1 05:23:53 np0005540826 podman[241207]: 2025-12-01 10:23:53.31167915 +0000 UTC m=+0.060378964 container exec 7c618b1a07db29948a07bf15abaac0e58a16d635b553771e228aad7ce3c0be89 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-crash-compute-1, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 05:23:53 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:23:53 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:23:53 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:23:53.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:23:53 np0005540826 podman[241207]: 2025-12-01 10:23:53.420550484 +0000 UTC m=+0.169250278 container exec_died 7c618b1a07db29948a07bf15abaac0e58a16d635b553771e228aad7ce3c0be89 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-crash-compute-1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 05:23:53 np0005540826 podman[241323]: 2025-12-01 10:23:53.868555316 +0000 UTC m=+0.055169606 container exec b5cba524f9aa4c6a3dd34b5465b21a05089c422db1ad8a4bb4113778960afe0c (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec  1 05:23:53 np0005540826 podman[241323]: 2025-12-01 10:23:53.875275261 +0000 UTC m=+0.061889521 container exec_died b5cba524f9aa4c6a3dd34b5465b21a05089c422db1ad8a4bb4113778960afe0c (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec  1 05:23:54 np0005540826 nova_compute[229148]: 2025-12-01 10:23:54.110 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:23:54 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Dec  1 05:23:54 np0005540826 podman[241458]: 2025-12-01 10:23:54.529000346 +0000 UTC m=+0.202719550 container exec 5d28e05f02e2faaaecccbb224265ce968eb74994512ada8525a0c42a70019a9e (image=quay.io/ceph/haproxy:2.3, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis)
Dec  1 05:23:54 np0005540826 podman[241458]: 2025-12-01 10:23:54.700485208 +0000 UTC m=+0.374204412 container exec_died 5d28e05f02e2faaaecccbb224265ce968eb74994512ada8525a0c42a70019a9e (image=quay.io/ceph/haproxy:2.3, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis)
Dec  1 05:23:54 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:23:54 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:23:54 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:23:54.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:23:54 np0005540826 nova_compute[229148]: 2025-12-01 10:23:54.834 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:23:55 np0005540826 podman[241525]: 2025-12-01 10:23:55.272082736 +0000 UTC m=+0.163822005 container exec b99b1792fbad48bc30fedd676b925d405af73aba1041bee68b375eadfb2319cf (image=quay.io/ceph/keepalived:2.2.4, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-1-wzwqmm, io.k8s.display-name=Keepalived on RHEL 9, build-date=2023-02-22T09:23:20, description=keepalived for Ceph, name=keepalived, release=1793, vcs-type=git, summary=Provides keepalived on RHEL 9 for Ceph., io.openshift.tags=Ceph keepalived, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, version=2.2.4, com.redhat.component=keepalived-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.28.2, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9)
Dec  1 05:23:55 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:23:55 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:23:55 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:23:55.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:23:55 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:23:55 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:23:55 np0005540826 podman[241547]: 2025-12-01 10:23:55.577371803 +0000 UTC m=+0.284979720 container exec_died b99b1792fbad48bc30fedd676b925d405af73aba1041bee68b375eadfb2319cf (image=quay.io/ceph/keepalived:2.2.4, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-1-wzwqmm, vendor=Red Hat, Inc., release=1793, summary=Provides keepalived on RHEL 9 for Ceph., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vcs-type=git, com.redhat.component=keepalived-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, version=2.2.4, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.tags=Ceph keepalived, io.buildah.version=1.28.2, description=keepalived for Ceph, io.openshift.expose-services=, build-date=2023-02-22T09:23:20, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=keepalived)
Dec  1 05:23:55 np0005540826 podman[241525]: 2025-12-01 10:23:55.58458595 +0000 UTC m=+0.476325239 container exec_died b99b1792fbad48bc30fedd676b925d405af73aba1041bee68b375eadfb2319cf (image=quay.io/ceph/keepalived:2.2.4, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-1-wzwqmm, com.redhat.component=keepalived-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.28.2, summary=Provides keepalived on RHEL 9 for Ceph., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.openshift.tags=Ceph keepalived, version=2.2.4, build-date=2023-02-22T09:23:20, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-type=git, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Keepalived on RHEL 9, io.openshift.expose-services=, vendor=Red Hat, Inc., description=keepalived for Ceph, name=keepalived, release=1793, distribution-scope=public)
Dec  1 05:23:55 np0005540826 nova_compute[229148]: 2025-12-01 10:23:55.874 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:23:56 np0005540826 nova_compute[229148]: 2025-12-01 10:23:56.110 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:23:56 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:23:56 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:23:56 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:23:56.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:23:56 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:23:56 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:23:56 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:23:56 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Dec  1 05:23:56 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Dec  1 05:23:56 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 05:23:57 np0005540826 podman[241642]: 2025-12-01 10:23:57.011733908 +0000 UTC m=+0.091350664 container health_status 6b06bbb7dde72e1297fe084ecc5611549b12415c8a2a7d771b9728c6924dbf55 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec  1 05:23:57 np0005540826 nova_compute[229148]: 2025-12-01 10:23:57.110 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:23:57 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:23:57 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:23:57 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:23:57.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:23:58 np0005540826 ceph-mon[80026]: Health check failed: 2 failed cephadm daemon(s) (CEPHADM_FAILED_DAEMON)
Dec  1 05:23:58 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:23:58 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:23:58 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 05:23:58 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:23:58 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:23:58 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:23:58.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:23:59 np0005540826 nova_compute[229148]: 2025-12-01 10:23:59.110 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:23:59 np0005540826 nova_compute[229148]: 2025-12-01 10:23:59.110 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  1 05:23:59 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:23:59 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:23:59 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:23:59.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:23:59 np0005540826 nova_compute[229148]: 2025-12-01 10:23:59.836 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:24:00 np0005540826 nova_compute[229148]: 2025-12-01 10:24:00.110 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:24:00 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:24:00 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:24:00 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:24:00.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:24:00 np0005540826 nova_compute[229148]: 2025-12-01 10:24:00.878 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:24:00 np0005540826 podman[241670]: 2025-12-01 10:24:00.964943044 +0000 UTC m=+0.052478260 container health_status 2f6a34a2eb1139886d5df897b4264e2aa17883bd121a8757ac91426f6d44356f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec  1 05:24:01 np0005540826 nova_compute[229148]: 2025-12-01 10:24:01.110 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:24:01 np0005540826 nova_compute[229148]: 2025-12-01 10:24:01.110 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:24:01 np0005540826 nova_compute[229148]: 2025-12-01 10:24:01.137 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:24:01 np0005540826 nova_compute[229148]: 2025-12-01 10:24:01.137 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:24:01 np0005540826 nova_compute[229148]: 2025-12-01 10:24:01.138 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:24:01 np0005540826 nova_compute[229148]: 2025-12-01 10:24:01.138 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  1 05:24:01 np0005540826 nova_compute[229148]: 2025-12-01 10:24:01.138 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:24:01 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:24:01 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:24:01 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:24:01.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:24:01 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:24:01 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3653343762' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:24:01 np0005540826 nova_compute[229148]: 2025-12-01 10:24:01.630 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.492s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:24:01 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:24:01 np0005540826 nova_compute[229148]: 2025-12-01 10:24:01.778 229152 WARNING nova.virt.libvirt.driver [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  1 05:24:01 np0005540826 nova_compute[229148]: 2025-12-01 10:24:01.779 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4945MB free_disk=59.9428825378418GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  1 05:24:01 np0005540826 nova_compute[229148]: 2025-12-01 10:24:01.779 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:24:01 np0005540826 nova_compute[229148]: 2025-12-01 10:24:01.780 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:24:01 np0005540826 nova_compute[229148]: 2025-12-01 10:24:01.837 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  1 05:24:01 np0005540826 nova_compute[229148]: 2025-12-01 10:24:01.838 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  1 05:24:01 np0005540826 nova_compute[229148]: 2025-12-01 10:24:01.856 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:24:02 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:24:02 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2023787751' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:24:02 np0005540826 nova_compute[229148]: 2025-12-01 10:24:02.331 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.475s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:24:02 np0005540826 nova_compute[229148]: 2025-12-01 10:24:02.338 229152 DEBUG nova.compute.provider_tree [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Inventory has not changed in ProviderTree for provider: 19014d04-db84-4f3d-831b-084720e9168c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  1 05:24:02 np0005540826 nova_compute[229148]: 2025-12-01 10:24:02.353 229152 DEBUG nova.scheduler.client.report [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Inventory has not changed for provider 19014d04-db84-4f3d-831b-084720e9168c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  1 05:24:02 np0005540826 nova_compute[229148]: 2025-12-01 10:24:02.355 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  1 05:24:02 np0005540826 nova_compute[229148]: 2025-12-01 10:24:02.355 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.576s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:24:02 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:24:02 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:24:02 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:24:02.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:24:03 np0005540826 nova_compute[229148]: 2025-12-01 10:24:03.351 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:24:03 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:24:03 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:24:03 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:24:03.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:24:03 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:24:03 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:24:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:24:04.555 141685 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:24:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:24:04.556 141685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:24:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:24:04.556 141685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:24:04 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:24:04 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:24:04 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:24:04.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:24:04 np0005540826 nova_compute[229148]: 2025-12-01 10:24:04.837 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:24:05 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:24:05 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:24:05 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:24:05.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:24:05 np0005540826 nova_compute[229148]: 2025-12-01 10:24:05.881 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:24:06 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:24:06 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:24:06 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:24:06 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:24:06.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:24:07 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:24:07 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:24:07 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:24:07.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:24:08 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:24:08 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:24:08 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:24:08.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:24:09 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:24:09 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:24:09 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:24:09.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:24:09 np0005540826 nova_compute[229148]: 2025-12-01 10:24:09.840 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:24:10 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:24:10.170 141685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '36:10:da', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '4e:5c:35:98:90:37'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  1 05:24:10 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:24:10.171 141685 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  1 05:24:10 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:24:10.171 141685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=b99910e3-15ec-4cc7-b887-f5229f22d165, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  1 05:24:10 np0005540826 nova_compute[229148]: 2025-12-01 10:24:10.171 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:24:10 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:24:10 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:24:10 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:24:10 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:24:10.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:24:10 np0005540826 nova_compute[229148]: 2025-12-01 10:24:10.884 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:24:11 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:24:11 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:24:11 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:24:11.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:24:11 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:24:12 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:24:12 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:24:12 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:24:12.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:24:13 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:24:13 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:24:13 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:24:13.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:24:14 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:24:14 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:24:14 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:24:14.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:24:14 np0005540826 nova_compute[229148]: 2025-12-01 10:24:14.841 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:24:15 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:24:15 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:24:15 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:24:15.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:24:15 np0005540826 nova_compute[229148]: 2025-12-01 10:24:15.887 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:24:16 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:24:16 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:24:16 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:24:16 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:24:16.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:24:17 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:24:17 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:24:17 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:24:17.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:24:18 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:24:18 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:24:18 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:24:18.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:24:18 np0005540826 podman[241795]: 2025-12-01 10:24:18.984214842 +0000 UTC m=+0.060540337 container health_status 9ce59c2758d021c154e97f8a3178b73d8f43045c59b9ba6fd93369a760a49136 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 05:24:19 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:24:19 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:24:19 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:24:19.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:24:19 np0005540826 nova_compute[229148]: 2025-12-01 10:24:19.893 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:24:20 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:24:20 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:24:20 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:24:20.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:24:20 np0005540826 nova_compute[229148]: 2025-12-01 10:24:20.890 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:24:21 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:24:21 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:24:21 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:24:21.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:24:21 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:24:22 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:24:22 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:24:22 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:24:22.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:24:23 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:24:23 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:24:23 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:24:23.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:24:24 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:24:24 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:24:24 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:24:24.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:24:24 np0005540826 nova_compute[229148]: 2025-12-01 10:24:24.896 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:24:25 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:24:25 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:24:25 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:24:25.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:24:25 np0005540826 nova_compute[229148]: 2025-12-01 10:24:25.893 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:24:26 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:24:26 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:24:26 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:24:26 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:24:26.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:24:27 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:24:27 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:24:27 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:24:27.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:24:28 np0005540826 podman[241844]: 2025-12-01 10:24:28.02286431 +0000 UTC m=+0.108041734 container health_status 6b06bbb7dde72e1297fe084ecc5611549b12415c8a2a7d771b9728c6924dbf55 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec  1 05:24:28 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:24:28 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:24:28 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:24:28.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:24:29 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:24:29 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:24:29 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:24:29.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:24:29 np0005540826 nova_compute[229148]: 2025-12-01 10:24:29.898 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:24:30 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:24:30 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:24:30 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:24:30.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:24:30 np0005540826 nova_compute[229148]: 2025-12-01 10:24:30.896 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:24:31 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:24:31 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:24:31 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:24:31.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:24:31 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:24:31 np0005540826 podman[241874]: 2025-12-01 10:24:31.970106198 +0000 UTC m=+0.053298809 container health_status 2f6a34a2eb1139886d5df897b4264e2aa17883bd121a8757ac91426f6d44356f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec  1 05:24:32 np0005540826 systemd[1]: virtsecretd.service: Deactivated successfully.
Dec  1 05:24:32 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:24:32 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:24:32 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:24:32.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:24:33 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:24:33 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:24:33 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:24:33.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:24:34 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:24:34 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:24:34 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:24:34.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:24:34 np0005540826 nova_compute[229148]: 2025-12-01 10:24:34.899 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:24:35 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:24:35 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:24:35 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:24:35.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:24:35 np0005540826 nova_compute[229148]: 2025-12-01 10:24:35.898 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:24:36 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:24:36 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:24:36 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:24:36 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:24:36.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:24:37 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:24:37 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:24:37 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:24:37.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:24:38 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:24:38 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:24:38 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:24:38.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:24:39 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:24:39 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:24:39 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:24:39.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:24:39 np0005540826 nova_compute[229148]: 2025-12-01 10:24:39.901 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:24:40 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:24:40 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:24:40 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:24:40.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:24:40 np0005540826 nova_compute[229148]: 2025-12-01 10:24:40.901 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:24:41 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:24:41 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:24:41 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:24:41.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:24:41 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:24:42 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:24:42 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:24:42 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:24:42.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:24:43 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:24:43 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:24:43 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:24:43.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:24:44 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:24:44 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:24:44 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:24:44.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:24:44 np0005540826 nova_compute[229148]: 2025-12-01 10:24:44.933 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:24:45 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:24:45 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:24:45 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:24:45.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:24:45 np0005540826 nova_compute[229148]: 2025-12-01 10:24:45.904 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:24:46 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:24:46 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:24:46 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:24:46 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:24:46.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:24:47 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:24:47 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:24:47 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:24:47.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:24:48 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:24:48 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:24:48 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:24:48.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:24:49 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:24:49 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:24:49 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:24:49.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:24:49 np0005540826 nova_compute[229148]: 2025-12-01 10:24:49.935 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:24:49 np0005540826 podman[241928]: 2025-12-01 10:24:49.975928237 +0000 UTC m=+0.058995470 container health_status 9ce59c2758d021c154e97f8a3178b73d8f43045c59b9ba6fd93369a760a49136 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec  1 05:24:50 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:24:50 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:24:50 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:24:50.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:24:50 np0005540826 nova_compute[229148]: 2025-12-01 10:24:50.905 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:24:51 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:24:51 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:24:51 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:24:51.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:24:51 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:24:52 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:24:52 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:24:52 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:24:52.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:24:53 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:24:53 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:24:53 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:24:53.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:24:54 np0005540826 nova_compute[229148]: 2025-12-01 10:24:54.110 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:24:54 np0005540826 nova_compute[229148]: 2025-12-01 10:24:54.110 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Dec  1 05:24:54 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:24:54 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:24:54 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:24:54.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:24:54 np0005540826 nova_compute[229148]: 2025-12-01 10:24:54.937 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:24:55 np0005540826 nova_compute[229148]: 2025-12-01 10:24:55.124 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:24:55 np0005540826 nova_compute[229148]: 2025-12-01 10:24:55.125 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  1 05:24:55 np0005540826 nova_compute[229148]: 2025-12-01 10:24:55.125 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  1 05:24:55 np0005540826 nova_compute[229148]: 2025-12-01 10:24:55.139 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  1 05:24:55 np0005540826 nova_compute[229148]: 2025-12-01 10:24:55.140 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:24:55 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:24:55 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:24:55 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:24:55.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:24:55 np0005540826 nova_compute[229148]: 2025-12-01 10:24:55.909 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:24:56 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:24:56 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:24:56 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:24:56 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:24:56.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:24:57 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:24:57 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:24:57 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:24:57.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:24:58 np0005540826 nova_compute[229148]: 2025-12-01 10:24:58.110 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:24:58 np0005540826 nova_compute[229148]: 2025-12-01 10:24:58.110 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:24:58 np0005540826 nova_compute[229148]: 2025-12-01 10:24:58.159 229152 DEBUG oslo_concurrency.lockutils [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Acquiring lock "2fd6ef3f-65f1-4b07-8cb6-cf04d7943853" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:24:58 np0005540826 nova_compute[229148]: 2025-12-01 10:24:58.159 229152 DEBUG oslo_concurrency.lockutils [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "2fd6ef3f-65f1-4b07-8cb6-cf04d7943853" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:24:58 np0005540826 nova_compute[229148]: 2025-12-01 10:24:58.173 229152 DEBUG nova.compute.manager [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  1 05:24:58 np0005540826 nova_compute[229148]: 2025-12-01 10:24:58.373 229152 DEBUG oslo_concurrency.lockutils [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:24:58 np0005540826 nova_compute[229148]: 2025-12-01 10:24:58.376 229152 DEBUG oslo_concurrency.lockutils [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:24:58 np0005540826 nova_compute[229148]: 2025-12-01 10:24:58.384 229152 DEBUG nova.virt.hardware [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  1 05:24:58 np0005540826 nova_compute[229148]: 2025-12-01 10:24:58.385 229152 INFO nova.compute.claims [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853] Claim successful on node compute-1.ctlplane.example.com#033[00m
Dec  1 05:24:58 np0005540826 nova_compute[229148]: 2025-12-01 10:24:58.583 229152 DEBUG oslo_concurrency.processutils [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:24:58 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:24:58 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:24:58 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:24:58.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:24:59 np0005540826 podman[241973]: 2025-12-01 10:24:59.00746985 +0000 UTC m=+0.088523475 container health_status 6b06bbb7dde72e1297fe084ecc5611549b12415c8a2a7d771b9728c6924dbf55 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Dec  1 05:24:59 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:24:59 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1287002598' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:24:59 np0005540826 nova_compute[229148]: 2025-12-01 10:24:59.044 229152 DEBUG oslo_concurrency.processutils [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.461s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:24:59 np0005540826 nova_compute[229148]: 2025-12-01 10:24:59.051 229152 DEBUG nova.compute.provider_tree [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Inventory has not changed in ProviderTree for provider: 19014d04-db84-4f3d-831b-084720e9168c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  1 05:24:59 np0005540826 nova_compute[229148]: 2025-12-01 10:24:59.068 229152 DEBUG nova.scheduler.client.report [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Inventory has not changed for provider 19014d04-db84-4f3d-831b-084720e9168c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  1 05:24:59 np0005540826 nova_compute[229148]: 2025-12-01 10:24:59.095 229152 DEBUG oslo_concurrency.lockutils [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.719s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:24:59 np0005540826 nova_compute[229148]: 2025-12-01 10:24:59.096 229152 DEBUG nova.compute.manager [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  1 05:24:59 np0005540826 nova_compute[229148]: 2025-12-01 10:24:59.109 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:24:59 np0005540826 nova_compute[229148]: 2025-12-01 10:24:59.109 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  1 05:24:59 np0005540826 nova_compute[229148]: 2025-12-01 10:24:59.135 229152 DEBUG nova.compute.manager [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  1 05:24:59 np0005540826 nova_compute[229148]: 2025-12-01 10:24:59.136 229152 DEBUG nova.network.neutron [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  1 05:24:59 np0005540826 nova_compute[229148]: 2025-12-01 10:24:59.159 229152 INFO nova.virt.libvirt.driver [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  1 05:24:59 np0005540826 nova_compute[229148]: 2025-12-01 10:24:59.176 229152 DEBUG nova.compute.manager [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  1 05:24:59 np0005540826 nova_compute[229148]: 2025-12-01 10:24:59.264 229152 DEBUG nova.compute.manager [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  1 05:24:59 np0005540826 nova_compute[229148]: 2025-12-01 10:24:59.266 229152 DEBUG nova.virt.libvirt.driver [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  1 05:24:59 np0005540826 nova_compute[229148]: 2025-12-01 10:24:59.266 229152 INFO nova.virt.libvirt.driver [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853] Creating image(s)#033[00m
Dec  1 05:24:59 np0005540826 nova_compute[229148]: 2025-12-01 10:24:59.293 229152 DEBUG nova.storage.rbd_utils [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] rbd image 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Dec  1 05:24:59 np0005540826 nova_compute[229148]: 2025-12-01 10:24:59.321 229152 DEBUG nova.storage.rbd_utils [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] rbd image 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Dec  1 05:24:59 np0005540826 nova_compute[229148]: 2025-12-01 10:24:59.348 229152 DEBUG nova.storage.rbd_utils [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] rbd image 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Dec  1 05:24:59 np0005540826 nova_compute[229148]: 2025-12-01 10:24:59.352 229152 DEBUG oslo_concurrency.processutils [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/caad95fa2cc8ed03bed2e9851744954b07ec7b34 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:24:59 np0005540826 nova_compute[229148]: 2025-12-01 10:24:59.412 229152 DEBUG oslo_concurrency.processutils [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/caad95fa2cc8ed03bed2e9851744954b07ec7b34 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:24:59 np0005540826 nova_compute[229148]: 2025-12-01 10:24:59.413 229152 DEBUG oslo_concurrency.lockutils [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Acquiring lock "caad95fa2cc8ed03bed2e9851744954b07ec7b34" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:24:59 np0005540826 nova_compute[229148]: 2025-12-01 10:24:59.413 229152 DEBUG oslo_concurrency.lockutils [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "caad95fa2cc8ed03bed2e9851744954b07ec7b34" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:24:59 np0005540826 nova_compute[229148]: 2025-12-01 10:24:59.414 229152 DEBUG oslo_concurrency.lockutils [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "caad95fa2cc8ed03bed2e9851744954b07ec7b34" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:24:59 np0005540826 nova_compute[229148]: 2025-12-01 10:24:59.442 229152 DEBUG nova.storage.rbd_utils [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] rbd image 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Dec  1 05:24:59 np0005540826 nova_compute[229148]: 2025-12-01 10:24:59.446 229152 DEBUG oslo_concurrency.processutils [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/caad95fa2cc8ed03bed2e9851744954b07ec7b34 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:24:59 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:24:59 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:24:59 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:24:59.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:24:59 np0005540826 nova_compute[229148]: 2025-12-01 10:24:59.767 229152 DEBUG oslo_concurrency.processutils [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/caad95fa2cc8ed03bed2e9851744954b07ec7b34 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.320s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:24:59 np0005540826 nova_compute[229148]: 2025-12-01 10:24:59.849 229152 DEBUG nova.storage.rbd_utils [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] resizing rbd image 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Dec  1 05:24:59 np0005540826 nova_compute[229148]: 2025-12-01 10:24:59.912 229152 DEBUG nova.policy [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5b56a238daf0445798410e51caada0ff', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9f6be4e572624210b91193c011607c08', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  1 05:24:59 np0005540826 nova_compute[229148]: 2025-12-01 10:24:59.996 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:25:00 np0005540826 nova_compute[229148]: 2025-12-01 10:25:00.002 229152 DEBUG nova.objects.instance [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lazy-loading 'migration_context' on Instance uuid 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  1 05:25:00 np0005540826 nova_compute[229148]: 2025-12-01 10:25:00.015 229152 DEBUG nova.virt.libvirt.driver [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  1 05:25:00 np0005540826 nova_compute[229148]: 2025-12-01 10:25:00.015 229152 DEBUG nova.virt.libvirt.driver [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853] Ensure instance console log exists: /var/lib/nova/instances/2fd6ef3f-65f1-4b07-8cb6-cf04d7943853/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  1 05:25:00 np0005540826 nova_compute[229148]: 2025-12-01 10:25:00.016 229152 DEBUG oslo_concurrency.lockutils [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:25:00 np0005540826 nova_compute[229148]: 2025-12-01 10:25:00.016 229152 DEBUG oslo_concurrency.lockutils [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:25:00 np0005540826 nova_compute[229148]: 2025-12-01 10:25:00.017 229152 DEBUG oslo_concurrency.lockutils [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:25:00 np0005540826 nova_compute[229148]: 2025-12-01 10:25:00.110 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:25:00 np0005540826 nova_compute[229148]: 2025-12-01 10:25:00.622 229152 DEBUG nova.network.neutron [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853] Successfully created port: 91698a91-5908-4580-acd7-7dd9246226da _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec  1 05:25:00 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:25:00 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:25:00 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:25:00.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:25:00 np0005540826 nova_compute[229148]: 2025-12-01 10:25:00.912 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:25:01 np0005540826 nova_compute[229148]: 2025-12-01 10:25:01.110 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:25:01 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:25:01 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:25:01 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:25:01.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:25:01 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:25:01 np0005540826 nova_compute[229148]: 2025-12-01 10:25:01.989 229152 DEBUG nova.network.neutron [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853] Successfully updated port: 91698a91-5908-4580-acd7-7dd9246226da _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  1 05:25:02 np0005540826 nova_compute[229148]: 2025-12-01 10:25:02.012 229152 DEBUG oslo_concurrency.lockutils [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Acquiring lock "refresh_cache-2fd6ef3f-65f1-4b07-8cb6-cf04d7943853" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  1 05:25:02 np0005540826 nova_compute[229148]: 2025-12-01 10:25:02.012 229152 DEBUG oslo_concurrency.lockutils [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Acquired lock "refresh_cache-2fd6ef3f-65f1-4b07-8cb6-cf04d7943853" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  1 05:25:02 np0005540826 nova_compute[229148]: 2025-12-01 10:25:02.012 229152 DEBUG nova.network.neutron [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  1 05:25:02 np0005540826 nova_compute[229148]: 2025-12-01 10:25:02.090 229152 DEBUG nova.compute.manager [req-7f12bef4-c367-48e0-86f2-bb0d71a9c813 req-5b5cfaa6-bcb9-4570-acb4-7098ac03cd71 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853] Received event network-changed-91698a91-5908-4580-acd7-7dd9246226da external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  1 05:25:02 np0005540826 nova_compute[229148]: 2025-12-01 10:25:02.090 229152 DEBUG nova.compute.manager [req-7f12bef4-c367-48e0-86f2-bb0d71a9c813 req-5b5cfaa6-bcb9-4570-acb4-7098ac03cd71 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853] Refreshing instance network info cache due to event network-changed-91698a91-5908-4580-acd7-7dd9246226da. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  1 05:25:02 np0005540826 nova_compute[229148]: 2025-12-01 10:25:02.090 229152 DEBUG oslo_concurrency.lockutils [req-7f12bef4-c367-48e0-86f2-bb0d71a9c813 req-5b5cfaa6-bcb9-4570-acb4-7098ac03cd71 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Acquiring lock "refresh_cache-2fd6ef3f-65f1-4b07-8cb6-cf04d7943853" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  1 05:25:02 np0005540826 nova_compute[229148]: 2025-12-01 10:25:02.105 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:25:02 np0005540826 nova_compute[229148]: 2025-12-01 10:25:02.149 229152 DEBUG nova.network.neutron [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  1 05:25:02 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:25:02 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:25:02 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:25:02.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:25:02 np0005540826 podman[242169]: 2025-12-01 10:25:02.970915327 +0000 UTC m=+0.050513971 container health_status 2f6a34a2eb1139886d5df897b4264e2aa17883bd121a8757ac91426f6d44356f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Dec  1 05:25:03 np0005540826 nova_compute[229148]: 2025-12-01 10:25:03.109 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:25:03 np0005540826 nova_compute[229148]: 2025-12-01 10:25:03.132 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:25:03 np0005540826 nova_compute[229148]: 2025-12-01 10:25:03.133 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:25:03 np0005540826 nova_compute[229148]: 2025-12-01 10:25:03.133 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:25:03 np0005540826 nova_compute[229148]: 2025-12-01 10:25:03.133 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  1 05:25:03 np0005540826 nova_compute[229148]: 2025-12-01 10:25:03.133 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:25:03 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:25:03 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:25:03 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:25:03.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:25:03 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:25:03 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3703375154' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:25:03 np0005540826 nova_compute[229148]: 2025-12-01 10:25:03.584 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.450s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:25:03 np0005540826 nova_compute[229148]: 2025-12-01 10:25:03.725 229152 WARNING nova.virt.libvirt.driver [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  1 05:25:03 np0005540826 nova_compute[229148]: 2025-12-01 10:25:03.726 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4878MB free_disk=59.92213439941406GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  1 05:25:03 np0005540826 nova_compute[229148]: 2025-12-01 10:25:03.727 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:25:03 np0005540826 nova_compute[229148]: 2025-12-01 10:25:03.727 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:25:03 np0005540826 nova_compute[229148]: 2025-12-01 10:25:03.783 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Instance 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  1 05:25:03 np0005540826 nova_compute[229148]: 2025-12-01 10:25:03.784 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  1 05:25:03 np0005540826 nova_compute[229148]: 2025-12-01 10:25:03.785 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=59GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  1 05:25:03 np0005540826 nova_compute[229148]: 2025-12-01 10:25:03.816 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:25:03 np0005540826 nova_compute[229148]: 2025-12-01 10:25:03.870 229152 DEBUG nova.network.neutron [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853] Updating instance_info_cache with network_info: [{"id": "91698a91-5908-4580-acd7-7dd9246226da", "address": "fa:16:3e:ca:92:34", "network": {"id": "82ec8f83-684f-44ae-8389-122bf8ed45ab", "bridge": "br-int", "label": "tempest-network-smoke--115101625", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap91698a91-59", "ovs_interfaceid": "91698a91-5908-4580-acd7-7dd9246226da", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  1 05:25:03 np0005540826 nova_compute[229148]: 2025-12-01 10:25:03.889 229152 DEBUG oslo_concurrency.lockutils [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Releasing lock "refresh_cache-2fd6ef3f-65f1-4b07-8cb6-cf04d7943853" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  1 05:25:03 np0005540826 nova_compute[229148]: 2025-12-01 10:25:03.889 229152 DEBUG nova.compute.manager [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853] Instance network_info: |[{"id": "91698a91-5908-4580-acd7-7dd9246226da", "address": "fa:16:3e:ca:92:34", "network": {"id": "82ec8f83-684f-44ae-8389-122bf8ed45ab", "bridge": "br-int", "label": "tempest-network-smoke--115101625", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap91698a91-59", "ovs_interfaceid": "91698a91-5908-4580-acd7-7dd9246226da", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  1 05:25:03 np0005540826 nova_compute[229148]: 2025-12-01 10:25:03.889 229152 DEBUG oslo_concurrency.lockutils [req-7f12bef4-c367-48e0-86f2-bb0d71a9c813 req-5b5cfaa6-bcb9-4570-acb4-7098ac03cd71 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Acquired lock "refresh_cache-2fd6ef3f-65f1-4b07-8cb6-cf04d7943853" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  1 05:25:03 np0005540826 nova_compute[229148]: 2025-12-01 10:25:03.890 229152 DEBUG nova.network.neutron [req-7f12bef4-c367-48e0-86f2-bb0d71a9c813 req-5b5cfaa6-bcb9-4570-acb4-7098ac03cd71 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853] Refreshing network info cache for port 91698a91-5908-4580-acd7-7dd9246226da _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  1 05:25:03 np0005540826 nova_compute[229148]: 2025-12-01 10:25:03.892 229152 DEBUG nova.virt.libvirt.driver [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853] Start _get_guest_xml network_info=[{"id": "91698a91-5908-4580-acd7-7dd9246226da", "address": "fa:16:3e:ca:92:34", "network": {"id": "82ec8f83-684f-44ae-8389-122bf8ed45ab", "bridge": "br-int", "label": "tempest-network-smoke--115101625", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap91698a91-59", "ovs_interfaceid": "91698a91-5908-4580-acd7-7dd9246226da", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-01T10:14:19Z,direct_url=<?>,disk_format='qcow2',id=8f75d6de-6ce0-44e1-b417-d0111424475b,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9a5734898a6345909986f17ddf57b27d',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-01T10:14:22Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'encryption_options': None, 'disk_bus': 'virtio', 'encryption_format': None, 'size': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'device_type': 'disk', 'image_id': '8f75d6de-6ce0-44e1-b417-d0111424475b'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  1 05:25:03 np0005540826 nova_compute[229148]: 2025-12-01 10:25:03.902 229152 WARNING nova.virt.libvirt.driver [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  1 05:25:03 np0005540826 nova_compute[229148]: 2025-12-01 10:25:03.912 229152 DEBUG nova.virt.libvirt.host [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  1 05:25:03 np0005540826 nova_compute[229148]: 2025-12-01 10:25:03.913 229152 DEBUG nova.virt.libvirt.host [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  1 05:25:03 np0005540826 nova_compute[229148]: 2025-12-01 10:25:03.917 229152 DEBUG nova.virt.libvirt.host [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  1 05:25:03 np0005540826 nova_compute[229148]: 2025-12-01 10:25:03.917 229152 DEBUG nova.virt.libvirt.host [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  1 05:25:03 np0005540826 nova_compute[229148]: 2025-12-01 10:25:03.918 229152 DEBUG nova.virt.libvirt.driver [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  1 05:25:03 np0005540826 nova_compute[229148]: 2025-12-01 10:25:03.918 229152 DEBUG nova.virt.hardware [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-01T10:14:17Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='2e731827-1896-49cd-b0cc-12903555d217',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-01T10:14:19Z,direct_url=<?>,disk_format='qcow2',id=8f75d6de-6ce0-44e1-b417-d0111424475b,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9a5734898a6345909986f17ddf57b27d',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-01T10:14:22Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  1 05:25:03 np0005540826 nova_compute[229148]: 2025-12-01 10:25:03.918 229152 DEBUG nova.virt.hardware [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  1 05:25:03 np0005540826 nova_compute[229148]: 2025-12-01 10:25:03.919 229152 DEBUG nova.virt.hardware [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  1 05:25:03 np0005540826 nova_compute[229148]: 2025-12-01 10:25:03.919 229152 DEBUG nova.virt.hardware [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  1 05:25:03 np0005540826 nova_compute[229148]: 2025-12-01 10:25:03.919 229152 DEBUG nova.virt.hardware [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  1 05:25:03 np0005540826 nova_compute[229148]: 2025-12-01 10:25:03.919 229152 DEBUG nova.virt.hardware [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  1 05:25:03 np0005540826 nova_compute[229148]: 2025-12-01 10:25:03.920 229152 DEBUG nova.virt.hardware [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  1 05:25:03 np0005540826 nova_compute[229148]: 2025-12-01 10:25:03.920 229152 DEBUG nova.virt.hardware [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  1 05:25:03 np0005540826 nova_compute[229148]: 2025-12-01 10:25:03.921 229152 DEBUG nova.virt.hardware [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  1 05:25:03 np0005540826 nova_compute[229148]: 2025-12-01 10:25:03.921 229152 DEBUG nova.virt.hardware [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  1 05:25:03 np0005540826 nova_compute[229148]: 2025-12-01 10:25:03.921 229152 DEBUG nova.virt.hardware [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  1 05:25:03 np0005540826 nova_compute[229148]: 2025-12-01 10:25:03.925 229152 DEBUG oslo_concurrency.processutils [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:25:04 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec  1 05:25:04 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3128697381' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Dec  1 05:25:04 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:25:04 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4127506147' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:25:04 np0005540826 nova_compute[229148]: 2025-12-01 10:25:04.378 229152 DEBUG oslo_concurrency.processutils [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:25:04 np0005540826 nova_compute[229148]: 2025-12-01 10:25:04.403 229152 DEBUG nova.storage.rbd_utils [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] rbd image 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Dec  1 05:25:04 np0005540826 nova_compute[229148]: 2025-12-01 10:25:04.407 229152 DEBUG oslo_concurrency.processutils [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:25:04 np0005540826 nova_compute[229148]: 2025-12-01 10:25:04.428 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.612s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:25:04 np0005540826 nova_compute[229148]: 2025-12-01 10:25:04.434 229152 DEBUG nova.compute.provider_tree [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Inventory has not changed in ProviderTree for provider: 19014d04-db84-4f3d-831b-084720e9168c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  1 05:25:04 np0005540826 nova_compute[229148]: 2025-12-01 10:25:04.454 229152 DEBUG nova.scheduler.client.report [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Inventory has not changed for provider 19014d04-db84-4f3d-831b-084720e9168c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  1 05:25:04 np0005540826 nova_compute[229148]: 2025-12-01 10:25:04.474 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  1 05:25:04 np0005540826 nova_compute[229148]: 2025-12-01 10:25:04.475 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.748s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:25:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:25:04.557 141685 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:25:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:25:04.557 141685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:25:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:25:04.557 141685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:25:04 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:25:04 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:25:04 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:25:04.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:25:04 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec  1 05:25:04 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/243036720' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Dec  1 05:25:04 np0005540826 nova_compute[229148]: 2025-12-01 10:25:04.887 229152 DEBUG oslo_concurrency.processutils [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.480s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:25:04 np0005540826 nova_compute[229148]: 2025-12-01 10:25:04.889 229152 DEBUG nova.virt.libvirt.vif [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-01T10:24:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-647821466',display_name='tempest-TestNetworkBasicOps-server-647821466',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-647821466',id=12,image_ref='8f75d6de-6ce0-44e1-b417-d0111424475b',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEbjeoaxilM6CVu2R4EOzPCQGLqTAidZHcCxX3A7PaMSz81VzCcIIzJSrd792r/+wPgmu9RKsVWEi1wOWY+i4v+Ebg5QtWRH9mDnj21+8JWacv4KnerkrFTP+ktXsozcmA==',key_name='tempest-TestNetworkBasicOps-1274436416',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9f6be4e572624210b91193c011607c08',ramdisk_id='',reservation_id='r-ldupxg6a',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='8f75d6de-6ce0-44e1-b417-d0111424475b',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1248115384',owner_user_name='tempest-TestNetworkBasicOps-1248115384-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-01T10:24:59Z,user_data=None,user_id='5b56a238daf0445798410e51caada0ff',uuid=2fd6ef3f-65f1-4b07-8cb6-cf04d7943853,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "91698a91-5908-4580-acd7-7dd9246226da", "address": "fa:16:3e:ca:92:34", "network": {"id": "82ec8f83-684f-44ae-8389-122bf8ed45ab", "bridge": "br-int", "label": "tempest-network-smoke--115101625", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap91698a91-59", "ovs_interfaceid": "91698a91-5908-4580-acd7-7dd9246226da", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  1 05:25:04 np0005540826 nova_compute[229148]: 2025-12-01 10:25:04.889 229152 DEBUG nova.network.os_vif_util [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Converting VIF {"id": "91698a91-5908-4580-acd7-7dd9246226da", "address": "fa:16:3e:ca:92:34", "network": {"id": "82ec8f83-684f-44ae-8389-122bf8ed45ab", "bridge": "br-int", "label": "tempest-network-smoke--115101625", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap91698a91-59", "ovs_interfaceid": "91698a91-5908-4580-acd7-7dd9246226da", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  1 05:25:04 np0005540826 nova_compute[229148]: 2025-12-01 10:25:04.890 229152 DEBUG nova.network.os_vif_util [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ca:92:34,bridge_name='br-int',has_traffic_filtering=True,id=91698a91-5908-4580-acd7-7dd9246226da,network=Network(82ec8f83-684f-44ae-8389-122bf8ed45ab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap91698a91-59') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  1 05:25:04 np0005540826 nova_compute[229148]: 2025-12-01 10:25:04.891 229152 DEBUG nova.objects.instance [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lazy-loading 'pci_devices' on Instance uuid 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  1 05:25:04 np0005540826 nova_compute[229148]: 2025-12-01 10:25:04.908 229152 DEBUG nova.virt.libvirt.driver [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853] End _get_guest_xml xml=<domain type="kvm">
Dec  1 05:25:04 np0005540826 nova_compute[229148]:  <uuid>2fd6ef3f-65f1-4b07-8cb6-cf04d7943853</uuid>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:  <name>instance-0000000c</name>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:  <memory>131072</memory>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:  <vcpu>1</vcpu>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:  <metadata>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  1 05:25:04 np0005540826 nova_compute[229148]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:      <nova:name>tempest-TestNetworkBasicOps-server-647821466</nova:name>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:      <nova:creationTime>2025-12-01 10:25:03</nova:creationTime>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:      <nova:flavor name="m1.nano">
Dec  1 05:25:04 np0005540826 nova_compute[229148]:        <nova:memory>128</nova:memory>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:        <nova:disk>1</nova:disk>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:        <nova:swap>0</nova:swap>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:        <nova:ephemeral>0</nova:ephemeral>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:        <nova:vcpus>1</nova:vcpus>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:      </nova:flavor>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:      <nova:owner>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:        <nova:user uuid="5b56a238daf0445798410e51caada0ff">tempest-TestNetworkBasicOps-1248115384-project-member</nova:user>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:        <nova:project uuid="9f6be4e572624210b91193c011607c08">tempest-TestNetworkBasicOps-1248115384</nova:project>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:      </nova:owner>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:      <nova:root type="image" uuid="8f75d6de-6ce0-44e1-b417-d0111424475b"/>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:      <nova:ports>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:        <nova:port uuid="91698a91-5908-4580-acd7-7dd9246226da">
Dec  1 05:25:04 np0005540826 nova_compute[229148]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:        </nova:port>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:      </nova:ports>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:    </nova:instance>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:  </metadata>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:  <sysinfo type="smbios">
Dec  1 05:25:04 np0005540826 nova_compute[229148]:    <system>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:      <entry name="manufacturer">RDO</entry>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:      <entry name="product">OpenStack Compute</entry>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:      <entry name="serial">2fd6ef3f-65f1-4b07-8cb6-cf04d7943853</entry>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:      <entry name="uuid">2fd6ef3f-65f1-4b07-8cb6-cf04d7943853</entry>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:      <entry name="family">Virtual Machine</entry>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:    </system>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:  </sysinfo>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:  <os>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:    <boot dev="hd"/>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:    <smbios mode="sysinfo"/>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:  </os>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:  <features>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:    <acpi/>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:    <apic/>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:    <vmcoreinfo/>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:  </features>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:  <clock offset="utc">
Dec  1 05:25:04 np0005540826 nova_compute[229148]:    <timer name="pit" tickpolicy="delay"/>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:    <timer name="hpet" present="no"/>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:  </clock>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:  <cpu mode="host-model" match="exact">
Dec  1 05:25:04 np0005540826 nova_compute[229148]:    <topology sockets="1" cores="1" threads="1"/>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:  </cpu>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:  <devices>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:    <disk type="network" device="disk">
Dec  1 05:25:04 np0005540826 nova_compute[229148]:      <driver type="raw" cache="none"/>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:      <source protocol="rbd" name="vms/2fd6ef3f-65f1-4b07-8cb6-cf04d7943853_disk">
Dec  1 05:25:04 np0005540826 nova_compute[229148]:        <host name="192.168.122.100" port="6789"/>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:        <host name="192.168.122.102" port="6789"/>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:        <host name="192.168.122.101" port="6789"/>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:      </source>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:      <auth username="openstack">
Dec  1 05:25:04 np0005540826 nova_compute[229148]:        <secret type="ceph" uuid="365f19c2-81e5-5edd-b6b4-280555214d3a"/>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:      </auth>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:      <target dev="vda" bus="virtio"/>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:    </disk>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:    <disk type="network" device="cdrom">
Dec  1 05:25:04 np0005540826 nova_compute[229148]:      <driver type="raw" cache="none"/>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:      <source protocol="rbd" name="vms/2fd6ef3f-65f1-4b07-8cb6-cf04d7943853_disk.config">
Dec  1 05:25:04 np0005540826 nova_compute[229148]:        <host name="192.168.122.100" port="6789"/>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:        <host name="192.168.122.102" port="6789"/>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:        <host name="192.168.122.101" port="6789"/>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:      </source>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:      <auth username="openstack">
Dec  1 05:25:04 np0005540826 nova_compute[229148]:        <secret type="ceph" uuid="365f19c2-81e5-5edd-b6b4-280555214d3a"/>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:      </auth>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:      <target dev="sda" bus="sata"/>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:    </disk>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:    <interface type="ethernet">
Dec  1 05:25:04 np0005540826 nova_compute[229148]:      <mac address="fa:16:3e:ca:92:34"/>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:      <model type="virtio"/>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:      <driver name="vhost" rx_queue_size="512"/>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:      <mtu size="1442"/>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:      <target dev="tap91698a91-59"/>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:    </interface>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:    <serial type="pty">
Dec  1 05:25:04 np0005540826 nova_compute[229148]:      <log file="/var/lib/nova/instances/2fd6ef3f-65f1-4b07-8cb6-cf04d7943853/console.log" append="off"/>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:    </serial>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:    <video>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:      <model type="virtio"/>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:    </video>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:    <input type="tablet" bus="usb"/>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:    <rng model="virtio">
Dec  1 05:25:04 np0005540826 nova_compute[229148]:      <backend model="random">/dev/urandom</backend>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:    </rng>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root"/>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:    <controller type="usb" index="0"/>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:    <memballoon model="virtio">
Dec  1 05:25:04 np0005540826 nova_compute[229148]:      <stats period="10"/>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:    </memballoon>
Dec  1 05:25:04 np0005540826 nova_compute[229148]:  </devices>
Dec  1 05:25:04 np0005540826 nova_compute[229148]: </domain>
Dec  1 05:25:04 np0005540826 nova_compute[229148]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  1 05:25:04 np0005540826 nova_compute[229148]: 2025-12-01 10:25:04.910 229152 DEBUG nova.compute.manager [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853] Preparing to wait for external event network-vif-plugged-91698a91-5908-4580-acd7-7dd9246226da prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  1 05:25:04 np0005540826 nova_compute[229148]: 2025-12-01 10:25:04.911 229152 DEBUG oslo_concurrency.lockutils [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Acquiring lock "2fd6ef3f-65f1-4b07-8cb6-cf04d7943853-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:25:04 np0005540826 nova_compute[229148]: 2025-12-01 10:25:04.911 229152 DEBUG oslo_concurrency.lockutils [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "2fd6ef3f-65f1-4b07-8cb6-cf04d7943853-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:25:04 np0005540826 nova_compute[229148]: 2025-12-01 10:25:04.912 229152 DEBUG oslo_concurrency.lockutils [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "2fd6ef3f-65f1-4b07-8cb6-cf04d7943853-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:25:04 np0005540826 nova_compute[229148]: 2025-12-01 10:25:04.912 229152 DEBUG nova.virt.libvirt.vif [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-01T10:24:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-647821466',display_name='tempest-TestNetworkBasicOps-server-647821466',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-647821466',id=12,image_ref='8f75d6de-6ce0-44e1-b417-d0111424475b',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEbjeoaxilM6CVu2R4EOzPCQGLqTAidZHcCxX3A7PaMSz81VzCcIIzJSrd792r/+wPgmu9RKsVWEi1wOWY+i4v+Ebg5QtWRH9mDnj21+8JWacv4KnerkrFTP+ktXsozcmA==',key_name='tempest-TestNetworkBasicOps-1274436416',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9f6be4e572624210b91193c011607c08',ramdisk_id='',reservation_id='r-ldupxg6a',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='8f75d6de-6ce0-44e1-b417-d0111424475b',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1248115384',owner_user_name='tempest-TestNetworkBasicOps-1248115384-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-01T10:24:59Z,user_data=None,user_id='5b56a238daf0445798410e51caada0ff',uuid=2fd6ef3f-65f1-4b07-8cb6-cf04d7943853,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "91698a91-5908-4580-acd7-7dd9246226da", "address": "fa:16:3e:ca:92:34", "network": {"id": "82ec8f83-684f-44ae-8389-122bf8ed45ab", "bridge": "br-int", "label": "tempest-network-smoke--115101625", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap91698a91-59", "ovs_interfaceid": "91698a91-5908-4580-acd7-7dd9246226da", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  1 05:25:04 np0005540826 nova_compute[229148]: 2025-12-01 10:25:04.913 229152 DEBUG nova.network.os_vif_util [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Converting VIF {"id": "91698a91-5908-4580-acd7-7dd9246226da", "address": "fa:16:3e:ca:92:34", "network": {"id": "82ec8f83-684f-44ae-8389-122bf8ed45ab", "bridge": "br-int", "label": "tempest-network-smoke--115101625", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap91698a91-59", "ovs_interfaceid": "91698a91-5908-4580-acd7-7dd9246226da", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  1 05:25:04 np0005540826 nova_compute[229148]: 2025-12-01 10:25:04.914 229152 DEBUG nova.network.os_vif_util [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ca:92:34,bridge_name='br-int',has_traffic_filtering=True,id=91698a91-5908-4580-acd7-7dd9246226da,network=Network(82ec8f83-684f-44ae-8389-122bf8ed45ab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap91698a91-59') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  1 05:25:04 np0005540826 nova_compute[229148]: 2025-12-01 10:25:04.914 229152 DEBUG os_vif [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ca:92:34,bridge_name='br-int',has_traffic_filtering=True,id=91698a91-5908-4580-acd7-7dd9246226da,network=Network(82ec8f83-684f-44ae-8389-122bf8ed45ab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap91698a91-59') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  1 05:25:04 np0005540826 nova_compute[229148]: 2025-12-01 10:25:04.915 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:25:04 np0005540826 nova_compute[229148]: 2025-12-01 10:25:04.916 229152 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  1 05:25:04 np0005540826 nova_compute[229148]: 2025-12-01 10:25:04.916 229152 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  1 05:25:04 np0005540826 nova_compute[229148]: 2025-12-01 10:25:04.920 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:25:04 np0005540826 nova_compute[229148]: 2025-12-01 10:25:04.921 229152 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap91698a91-59, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  1 05:25:04 np0005540826 nova_compute[229148]: 2025-12-01 10:25:04.921 229152 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap91698a91-59, col_values=(('external_ids', {'iface-id': '91698a91-5908-4580-acd7-7dd9246226da', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ca:92:34', 'vm-uuid': '2fd6ef3f-65f1-4b07-8cb6-cf04d7943853'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  1 05:25:04 np0005540826 nova_compute[229148]: 2025-12-01 10:25:04.923 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:25:04 np0005540826 NetworkManager[48989]: <info>  [1764584704.9240] manager: (tap91698a91-59): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/63)
Dec  1 05:25:04 np0005540826 nova_compute[229148]: 2025-12-01 10:25:04.925 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  1 05:25:04 np0005540826 nova_compute[229148]: 2025-12-01 10:25:04.931 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:25:04 np0005540826 nova_compute[229148]: 2025-12-01 10:25:04.932 229152 INFO os_vif [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ca:92:34,bridge_name='br-int',has_traffic_filtering=True,id=91698a91-5908-4580-acd7-7dd9246226da,network=Network(82ec8f83-684f-44ae-8389-122bf8ed45ab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap91698a91-59')#033[00m
Dec  1 05:25:04 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 05:25:04 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:25:04 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:25:04 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 05:25:04 np0005540826 nova_compute[229148]: 2025-12-01 10:25:04.974 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:25:04 np0005540826 nova_compute[229148]: 2025-12-01 10:25:04.985 229152 DEBUG nova.virt.libvirt.driver [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  1 05:25:04 np0005540826 nova_compute[229148]: 2025-12-01 10:25:04.986 229152 DEBUG nova.virt.libvirt.driver [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  1 05:25:04 np0005540826 nova_compute[229148]: 2025-12-01 10:25:04.986 229152 DEBUG nova.virt.libvirt.driver [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] No VIF found with MAC fa:16:3e:ca:92:34, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  1 05:25:04 np0005540826 nova_compute[229148]: 2025-12-01 10:25:04.987 229152 INFO nova.virt.libvirt.driver [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853] Using config drive#033[00m
Dec  1 05:25:05 np0005540826 nova_compute[229148]: 2025-12-01 10:25:05.014 229152 DEBUG nova.storage.rbd_utils [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] rbd image 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Dec  1 05:25:05 np0005540826 nova_compute[229148]: 2025-12-01 10:25:05.020 229152 DEBUG nova.network.neutron [req-7f12bef4-c367-48e0-86f2-bb0d71a9c813 req-5b5cfaa6-bcb9-4570-acb4-7098ac03cd71 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853] Updated VIF entry in instance network info cache for port 91698a91-5908-4580-acd7-7dd9246226da. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  1 05:25:05 np0005540826 nova_compute[229148]: 2025-12-01 10:25:05.021 229152 DEBUG nova.network.neutron [req-7f12bef4-c367-48e0-86f2-bb0d71a9c813 req-5b5cfaa6-bcb9-4570-acb4-7098ac03cd71 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853] Updating instance_info_cache with network_info: [{"id": "91698a91-5908-4580-acd7-7dd9246226da", "address": "fa:16:3e:ca:92:34", "network": {"id": "82ec8f83-684f-44ae-8389-122bf8ed45ab", "bridge": "br-int", "label": "tempest-network-smoke--115101625", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap91698a91-59", "ovs_interfaceid": "91698a91-5908-4580-acd7-7dd9246226da", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  1 05:25:05 np0005540826 nova_compute[229148]: 2025-12-01 10:25:05.053 229152 DEBUG oslo_concurrency.lockutils [req-7f12bef4-c367-48e0-86f2-bb0d71a9c813 req-5b5cfaa6-bcb9-4570-acb4-7098ac03cd71 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Releasing lock "refresh_cache-2fd6ef3f-65f1-4b07-8cb6-cf04d7943853" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  1 05:25:05 np0005540826 nova_compute[229148]: 2025-12-01 10:25:05.307 229152 INFO nova.virt.libvirt.driver [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853] Creating config drive at /var/lib/nova/instances/2fd6ef3f-65f1-4b07-8cb6-cf04d7943853/disk.config#033[00m
Dec  1 05:25:05 np0005540826 nova_compute[229148]: 2025-12-01 10:25:05.312 229152 DEBUG oslo_concurrency.processutils [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/2fd6ef3f-65f1-4b07-8cb6-cf04d7943853/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpdkvtsd03 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:25:05 np0005540826 nova_compute[229148]: 2025-12-01 10:25:05.438 229152 DEBUG oslo_concurrency.processutils [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/2fd6ef3f-65f1-4b07-8cb6-cf04d7943853/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpdkvtsd03" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:25:05 np0005540826 nova_compute[229148]: 2025-12-01 10:25:05.468 229152 DEBUG nova.storage.rbd_utils [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] rbd image 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Dec  1 05:25:05 np0005540826 nova_compute[229148]: 2025-12-01 10:25:05.472 229152 DEBUG oslo_concurrency.processutils [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/2fd6ef3f-65f1-4b07-8cb6-cf04d7943853/disk.config 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:25:05 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:25:05 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:25:05 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:25:05.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:25:05 np0005540826 nova_compute[229148]: 2025-12-01 10:25:05.620 229152 DEBUG oslo_concurrency.processutils [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/2fd6ef3f-65f1-4b07-8cb6-cf04d7943853/disk.config 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.148s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:25:05 np0005540826 nova_compute[229148]: 2025-12-01 10:25:05.621 229152 INFO nova.virt.libvirt.driver [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853] Deleting local config drive /var/lib/nova/instances/2fd6ef3f-65f1-4b07-8cb6-cf04d7943853/disk.config because it was imported into RBD.#033[00m
Dec  1 05:25:05 np0005540826 systemd[1]: Starting libvirt secret daemon...
Dec  1 05:25:05 np0005540826 systemd[1]: Started libvirt secret daemon.
Dec  1 05:25:05 np0005540826 kernel: tap91698a91-59: entered promiscuous mode
Dec  1 05:25:05 np0005540826 NetworkManager[48989]: <info>  [1764584705.7069] manager: (tap91698a91-59): new Tun device (/org/freedesktop/NetworkManager/Devices/64)
Dec  1 05:25:05 np0005540826 nova_compute[229148]: 2025-12-01 10:25:05.705 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:25:05 np0005540826 nova_compute[229148]: 2025-12-01 10:25:05.713 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:25:05 np0005540826 ovn_controller[132309]: 2025-12-01T10:25:05Z|00096|binding|INFO|Claiming lport 91698a91-5908-4580-acd7-7dd9246226da for this chassis.
Dec  1 05:25:05 np0005540826 ovn_controller[132309]: 2025-12-01T10:25:05Z|00097|binding|INFO|91698a91-5908-4580-acd7-7dd9246226da: Claiming fa:16:3e:ca:92:34 10.100.0.7
Dec  1 05:25:05 np0005540826 nova_compute[229148]: 2025-12-01 10:25:05.716 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:25:05 np0005540826 nova_compute[229148]: 2025-12-01 10:25:05.718 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:25:05 np0005540826 NetworkManager[48989]: <info>  [1764584705.7210] manager: (patch-provnet-da274a4a-a49c-4f01-b728-391696cd2672-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/65)
Dec  1 05:25:05 np0005540826 NetworkManager[48989]: <info>  [1764584705.7218] manager: (patch-br-int-to-provnet-da274a4a-a49c-4f01-b728-391696cd2672): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/66)
Dec  1 05:25:05 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:25:05.726 141685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ca:92:34 10.100.0.7'], port_security=['fa:16:3e:ca:92:34 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '2fd6ef3f-65f1-4b07-8cb6-cf04d7943853', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-82ec8f83-684f-44ae-8389-122bf8ed45ab', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9f6be4e572624210b91193c011607c08', 'neutron:revision_number': '2', 'neutron:security_group_ids': '4ef8ee7a-5876-4532-b61e-5a17605fc3b6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bd915a5f-666a-4c2a-9612-6191ae438030, chassis=[<ovs.db.idl.Row object at 0x7fe07c6626d0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe07c6626d0>], logical_port=91698a91-5908-4580-acd7-7dd9246226da) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec  1 05:25:05 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:25:05.727 141685 INFO neutron.agent.ovn.metadata.agent [-] Port 91698a91-5908-4580-acd7-7dd9246226da in datapath 82ec8f83-684f-44ae-8389-122bf8ed45ab bound to our chassis
Dec  1 05:25:05 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:25:05.728 141685 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 82ec8f83-684f-44ae-8389-122bf8ed45ab
Dec  1 05:25:05 np0005540826 systemd-machined[192474]: New machine qemu-6-instance-0000000c.
Dec  1 05:25:05 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:25:05.741 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[284af219-9b8e-439d-afcf-ac9e5037b49d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec  1 05:25:05 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:25:05.745 141685 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap82ec8f83-61 in ovnmeta-82ec8f83-684f-44ae-8389-122bf8ed45ab namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec  1 05:25:05 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:25:05.746 233565 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap82ec8f83-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec  1 05:25:05 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:25:05.746 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[381826b7-c2cc-4a3d-8ba1-afafdee330e0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec  1 05:25:05 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:25:05.747 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[090b87ef-db4e-433e-a605-e46a2bab70c8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec  1 05:25:05 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:25:05.760 141797 DEBUG oslo.privsep.daemon [-] privsep: reply[95a12803-6c43-4048-a33b-c8136ca4ea4a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec  1 05:25:05 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:25:05.783 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[3f6f9c35-1e9c-4e5d-bc33-115348e1e7c2]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec  1 05:25:05 np0005540826 systemd[1]: Started Virtual Machine qemu-6-instance-0000000c.
Dec  1 05:25:05 np0005540826 nova_compute[229148]: 2025-12-01 10:25:05.796 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  1 05:25:05 np0005540826 systemd-udevd[242472]: Network interface NamePolicy= disabled on kernel command line.
Dec  1 05:25:05 np0005540826 nova_compute[229148]: 2025-12-01 10:25:05.806 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  1 05:25:05 np0005540826 ovn_controller[132309]: 2025-12-01T10:25:05Z|00098|binding|INFO|Setting lport 91698a91-5908-4580-acd7-7dd9246226da ovn-installed in OVS
Dec  1 05:25:05 np0005540826 ovn_controller[132309]: 2025-12-01T10:25:05Z|00099|binding|INFO|Setting lport 91698a91-5908-4580-acd7-7dd9246226da up in Southbound
Dec  1 05:25:05 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:25:05.815 233643 DEBUG oslo.privsep.daemon [-] privsep: reply[ef9b410a-c736-410b-be15-fa7843e2284a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec  1 05:25:05 np0005540826 nova_compute[229148]: 2025-12-01 10:25:05.818 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  1 05:25:05 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:25:05.821 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[fe34c48c-8ebf-471d-9c7d-c73797c319a5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec  1 05:25:05 np0005540826 NetworkManager[48989]: <info>  [1764584705.8217] manager: (tap82ec8f83-60): new Veth device (/org/freedesktop/NetworkManager/Devices/67)
Dec  1 05:25:05 np0005540826 systemd-udevd[242474]: Network interface NamePolicy= disabled on kernel command line.
Dec  1 05:25:05 np0005540826 NetworkManager[48989]: <info>  [1764584705.8257] device (tap91698a91-59): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  1 05:25:05 np0005540826 NetworkManager[48989]: <info>  [1764584705.8266] device (tap91698a91-59): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  1 05:25:05 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:25:05.855 233643 DEBUG oslo.privsep.daemon [-] privsep: reply[579ca23e-24d6-40ed-b7f6-a3bcce429087]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec  1 05:25:05 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:25:05.858 233643 DEBUG oslo.privsep.daemon [-] privsep: reply[bbdeeb94-4e6e-49ea-97be-47dd1bab5896]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec  1 05:25:05 np0005540826 NetworkManager[48989]: <info>  [1764584705.8815] device (tap82ec8f83-60): carrier: link connected
Dec  1 05:25:05 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:25:05.886 233643 DEBUG oslo.privsep.daemon [-] privsep: reply[bcab3d60-c987-4452-84eb-7995b1ebee55]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec  1 05:25:05 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:25:05.906 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[18a1e867-e9a6-43f6-b05d-0b9f4ecc7212]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap82ec8f83-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a5:e9:12'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 32], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 456840, 'reachable_time': 15825, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 242500, 'error': None, 'target': 'ovnmeta-82ec8f83-684f-44ae-8389-122bf8ed45ab', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec  1 05:25:05 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:25:05.922 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[6a08a67d-620d-4271-b57a-2419d1f07b44]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea5:e912'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 456840, 'tstamp': 456840}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 242501, 'error': None, 'target': 'ovnmeta-82ec8f83-684f-44ae-8389-122bf8ed45ab', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec  1 05:25:05 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:25:05.942 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[b61f25e4-75f9-4404-96e0-0db4180ca419]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap82ec8f83-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a5:e9:12'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 32], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 456840, 'reachable_time': 15825, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 242502, 'error': None, 'target': 'ovnmeta-82ec8f83-684f-44ae-8389-122bf8ed45ab', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec  1 05:25:05 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:25:05.978 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[8ff037af-8e89-49da-abbd-f6c5a8528604]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec  1 05:25:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:25:06.040 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[abc88ded-6d21-473c-b2ce-b78820f9414c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec  1 05:25:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:25:06.041 141685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap82ec8f83-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec  1 05:25:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:25:06.042 141685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec  1 05:25:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:25:06.042 141685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap82ec8f83-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec  1 05:25:06 np0005540826 kernel: tap82ec8f83-60: entered promiscuous mode
Dec  1 05:25:06 np0005540826 NetworkManager[48989]: <info>  [1764584706.0448] manager: (tap82ec8f83-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/68)
Dec  1 05:25:06 np0005540826 nova_compute[229148]: 2025-12-01 10:25:06.044 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  1 05:25:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:25:06.047 141685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap82ec8f83-60, col_values=(('external_ids', {'iface-id': '0873d4b6-d57f-4e35-9752-e86556fac481'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec  1 05:25:06 np0005540826 nova_compute[229148]: 2025-12-01 10:25:06.048 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  1 05:25:06 np0005540826 nova_compute[229148]: 2025-12-01 10:25:06.049 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  1 05:25:06 np0005540826 ovn_controller[132309]: 2025-12-01T10:25:06Z|00100|binding|INFO|Releasing lport 0873d4b6-d57f-4e35-9752-e86556fac481 from this chassis (sb_readonly=0)
Dec  1 05:25:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:25:06.050 141685 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/82ec8f83-684f-44ae-8389-122bf8ed45ab.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/82ec8f83-684f-44ae-8389-122bf8ed45ab.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec  1 05:25:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:25:06.062 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[c9aa1e87-41f8-4997-bcb1-8779b3ce55e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec  1 05:25:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:25:06.065 141685 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec  1 05:25:06 np0005540826 ovn_metadata_agent[141680]: global
Dec  1 05:25:06 np0005540826 ovn_metadata_agent[141680]:    log         /dev/log local0 debug
Dec  1 05:25:06 np0005540826 ovn_metadata_agent[141680]:    log-tag     haproxy-metadata-proxy-82ec8f83-684f-44ae-8389-122bf8ed45ab
Dec  1 05:25:06 np0005540826 ovn_metadata_agent[141680]:    user        root
Dec  1 05:25:06 np0005540826 ovn_metadata_agent[141680]:    group       root
Dec  1 05:25:06 np0005540826 ovn_metadata_agent[141680]:    maxconn     1024
Dec  1 05:25:06 np0005540826 ovn_metadata_agent[141680]:    pidfile     /var/lib/neutron/external/pids/82ec8f83-684f-44ae-8389-122bf8ed45ab.pid.haproxy
Dec  1 05:25:06 np0005540826 ovn_metadata_agent[141680]:    daemon
Dec  1 05:25:06 np0005540826 ovn_metadata_agent[141680]: 
Dec  1 05:25:06 np0005540826 ovn_metadata_agent[141680]: defaults
Dec  1 05:25:06 np0005540826 ovn_metadata_agent[141680]:    log global
Dec  1 05:25:06 np0005540826 ovn_metadata_agent[141680]:    mode http
Dec  1 05:25:06 np0005540826 ovn_metadata_agent[141680]:    option httplog
Dec  1 05:25:06 np0005540826 ovn_metadata_agent[141680]:    option dontlognull
Dec  1 05:25:06 np0005540826 ovn_metadata_agent[141680]:    option http-server-close
Dec  1 05:25:06 np0005540826 ovn_metadata_agent[141680]:    option forwardfor
Dec  1 05:25:06 np0005540826 ovn_metadata_agent[141680]:    retries                 3
Dec  1 05:25:06 np0005540826 ovn_metadata_agent[141680]:    timeout http-request    30s
Dec  1 05:25:06 np0005540826 ovn_metadata_agent[141680]:    timeout connect         30s
Dec  1 05:25:06 np0005540826 ovn_metadata_agent[141680]:    timeout client          32s
Dec  1 05:25:06 np0005540826 ovn_metadata_agent[141680]:    timeout server          32s
Dec  1 05:25:06 np0005540826 ovn_metadata_agent[141680]:    timeout http-keep-alive 30s
Dec  1 05:25:06 np0005540826 ovn_metadata_agent[141680]: 
Dec  1 05:25:06 np0005540826 ovn_metadata_agent[141680]: 
Dec  1 05:25:06 np0005540826 ovn_metadata_agent[141680]: listen listener
Dec  1 05:25:06 np0005540826 ovn_metadata_agent[141680]:    bind 169.254.169.254:80
Dec  1 05:25:06 np0005540826 ovn_metadata_agent[141680]:    server metadata /var/lib/neutron/metadata_proxy
Dec  1 05:25:06 np0005540826 ovn_metadata_agent[141680]:    http-request add-header X-OVN-Network-ID 82ec8f83-684f-44ae-8389-122bf8ed45ab
Dec  1 05:25:06 np0005540826 ovn_metadata_agent[141680]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec  1 05:25:06 np0005540826 nova_compute[229148]: 2025-12-01 10:25:06.065 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  1 05:25:06 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:25:06.066 141685 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-82ec8f83-684f-44ae-8389-122bf8ed45ab', 'env', 'PROCESS_TAG=haproxy-82ec8f83-684f-44ae-8389-122bf8ed45ab', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/82ec8f83-684f-44ae-8389-122bf8ed45ab.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec  1 05:25:06 np0005540826 nova_compute[229148]: 2025-12-01 10:25:06.323 229152 DEBUG nova.virt.driver [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] Emitting event <LifecycleEvent: 1764584706.322751, 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec  1 05:25:06 np0005540826 nova_compute[229148]: 2025-12-01 10:25:06.324 229152 INFO nova.compute.manager [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] [instance: 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853] VM Started (Lifecycle Event)
Dec  1 05:25:06 np0005540826 nova_compute[229148]: 2025-12-01 10:25:06.347 229152 DEBUG nova.compute.manager [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] [instance: 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec  1 05:25:06 np0005540826 nova_compute[229148]: 2025-12-01 10:25:06.353 229152 DEBUG nova.virt.driver [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] Emitting event <LifecycleEvent: 1764584706.3231626, 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec  1 05:25:06 np0005540826 nova_compute[229148]: 2025-12-01 10:25:06.353 229152 INFO nova.compute.manager [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] [instance: 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853] VM Paused (Lifecycle Event)
Dec  1 05:25:06 np0005540826 nova_compute[229148]: 2025-12-01 10:25:06.374 229152 DEBUG nova.compute.manager [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] [instance: 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec  1 05:25:06 np0005540826 nova_compute[229148]: 2025-12-01 10:25:06.378 229152 DEBUG nova.compute.manager [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] [instance: 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec  1 05:25:06 np0005540826 nova_compute[229148]: 2025-12-01 10:25:06.403 229152 INFO nova.compute.manager [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] [instance: 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853] During sync_power_state the instance has a pending task (spawning). Skip.
Dec  1 05:25:06 np0005540826 podman[242575]: 2025-12-01 10:25:06.440712661 +0000 UTC m=+0.049037795 container create d28ffdd641d0823ef23636b3358d4250524f7e4d91089217e34acf5e3502e3c4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-82ec8f83-684f-44ae-8389-122bf8ed45ab, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Dec  1 05:25:06 np0005540826 systemd[1]: Started libpod-conmon-d28ffdd641d0823ef23636b3358d4250524f7e4d91089217e34acf5e3502e3c4.scope.
Dec  1 05:25:06 np0005540826 systemd[1]: Started libcrun container.
Dec  1 05:25:06 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/037a1f9e4d31fa69e772a5e3b514ac2488de21de80f69623e8492434a4ec5585/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  1 05:25:06 np0005540826 podman[242575]: 2025-12-01 10:25:06.41626566 +0000 UTC m=+0.024590814 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  1 05:25:06 np0005540826 podman[242575]: 2025-12-01 10:25:06.51846503 +0000 UTC m=+0.126790194 container init d28ffdd641d0823ef23636b3358d4250524f7e4d91089217e34acf5e3502e3c4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-82ec8f83-684f-44ae-8389-122bf8ed45ab, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Dec  1 05:25:06 np0005540826 podman[242575]: 2025-12-01 10:25:06.524832637 +0000 UTC m=+0.133157801 container start d28ffdd641d0823ef23636b3358d4250524f7e4d91089217e34acf5e3502e3c4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-82ec8f83-684f-44ae-8389-122bf8ed45ab, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec  1 05:25:06 np0005540826 neutron-haproxy-ovnmeta-82ec8f83-684f-44ae-8389-122bf8ed45ab[242591]: [NOTICE]   (242603) : New worker (242620) forked
Dec  1 05:25:06 np0005540826 neutron-haproxy-ovnmeta-82ec8f83-684f-44ae-8389-122bf8ed45ab[242591]: [NOTICE]   (242603) : Loading success.
Dec  1 05:25:06 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:25:06 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:25:06 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:25:06 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:25:06.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:25:07 np0005540826 nova_compute[229148]: 2025-12-01 10:25:07.100 229152 DEBUG nova.compute.manager [req-ff5c024a-d748-4bee-ba8a-49765fd9841c req-242bd464-b716-4eb7-89a0-261653fa65f1 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853] Received event network-vif-plugged-91698a91-5908-4580-acd7-7dd9246226da external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec  1 05:25:07 np0005540826 nova_compute[229148]: 2025-12-01 10:25:07.100 229152 DEBUG oslo_concurrency.lockutils [req-ff5c024a-d748-4bee-ba8a-49765fd9841c req-242bd464-b716-4eb7-89a0-261653fa65f1 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Acquiring lock "2fd6ef3f-65f1-4b07-8cb6-cf04d7943853-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  1 05:25:07 np0005540826 nova_compute[229148]: 2025-12-01 10:25:07.101 229152 DEBUG oslo_concurrency.lockutils [req-ff5c024a-d748-4bee-ba8a-49765fd9841c req-242bd464-b716-4eb7-89a0-261653fa65f1 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Lock "2fd6ef3f-65f1-4b07-8cb6-cf04d7943853-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  1 05:25:07 np0005540826 nova_compute[229148]: 2025-12-01 10:25:07.101 229152 DEBUG oslo_concurrency.lockutils [req-ff5c024a-d748-4bee-ba8a-49765fd9841c req-242bd464-b716-4eb7-89a0-261653fa65f1 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Lock "2fd6ef3f-65f1-4b07-8cb6-cf04d7943853-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  1 05:25:07 np0005540826 nova_compute[229148]: 2025-12-01 10:25:07.101 229152 DEBUG nova.compute.manager [req-ff5c024a-d748-4bee-ba8a-49765fd9841c req-242bd464-b716-4eb7-89a0-261653fa65f1 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853] Processing event network-vif-plugged-91698a91-5908-4580-acd7-7dd9246226da _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec  1 05:25:07 np0005540826 nova_compute[229148]: 2025-12-01 10:25:07.102 229152 DEBUG nova.compute.manager [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec  1 05:25:07 np0005540826 nova_compute[229148]: 2025-12-01 10:25:07.106 229152 DEBUG nova.virt.driver [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] Emitting event <LifecycleEvent: 1764584707.1060276, 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec  1 05:25:07 np0005540826 nova_compute[229148]: 2025-12-01 10:25:07.106 229152 INFO nova.compute.manager [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] [instance: 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853] VM Resumed (Lifecycle Event)
Dec  1 05:25:07 np0005540826 nova_compute[229148]: 2025-12-01 10:25:07.109 229152 DEBUG nova.virt.libvirt.driver [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec  1 05:25:07 np0005540826 nova_compute[229148]: 2025-12-01 10:25:07.113 229152 INFO nova.virt.libvirt.driver [-] [instance: 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853] Instance spawned successfully.
Dec  1 05:25:07 np0005540826 nova_compute[229148]: 2025-12-01 10:25:07.113 229152 DEBUG nova.virt.libvirt.driver [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  1 05:25:07 np0005540826 nova_compute[229148]: 2025-12-01 10:25:07.134 229152 DEBUG nova.compute.manager [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] [instance: 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  1 05:25:07 np0005540826 nova_compute[229148]: 2025-12-01 10:25:07.138 229152 DEBUG nova.compute.manager [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] [instance: 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  1 05:25:07 np0005540826 nova_compute[229148]: 2025-12-01 10:25:07.141 229152 DEBUG nova.virt.libvirt.driver [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  1 05:25:07 np0005540826 nova_compute[229148]: 2025-12-01 10:25:07.142 229152 DEBUG nova.virt.libvirt.driver [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  1 05:25:07 np0005540826 nova_compute[229148]: 2025-12-01 10:25:07.142 229152 DEBUG nova.virt.libvirt.driver [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  1 05:25:07 np0005540826 nova_compute[229148]: 2025-12-01 10:25:07.143 229152 DEBUG nova.virt.libvirt.driver [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  1 05:25:07 np0005540826 nova_compute[229148]: 2025-12-01 10:25:07.143 229152 DEBUG nova.virt.libvirt.driver [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  1 05:25:07 np0005540826 nova_compute[229148]: 2025-12-01 10:25:07.143 229152 DEBUG nova.virt.libvirt.driver [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  1 05:25:07 np0005540826 nova_compute[229148]: 2025-12-01 10:25:07.175 229152 INFO nova.compute.manager [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] [instance: 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  1 05:25:07 np0005540826 nova_compute[229148]: 2025-12-01 10:25:07.206 229152 INFO nova.compute.manager [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853] Took 7.94 seconds to spawn the instance on the hypervisor.#033[00m
Dec  1 05:25:07 np0005540826 nova_compute[229148]: 2025-12-01 10:25:07.207 229152 DEBUG nova.compute.manager [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  1 05:25:07 np0005540826 nova_compute[229148]: 2025-12-01 10:25:07.266 229152 INFO nova.compute.manager [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853] Took 9.04 seconds to build instance.#033[00m
Dec  1 05:25:07 np0005540826 nova_compute[229148]: 2025-12-01 10:25:07.280 229152 DEBUG oslo_concurrency.lockutils [None req-4436dcae-7800-4b79-bfcf-756137187e73 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "2fd6ef3f-65f1-4b07-8cb6-cf04d7943853" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.121s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:25:07 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:25:07 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:25:07 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:25:07.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:25:08 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:25:08 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:25:08 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:25:08.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:25:09 np0005540826 nova_compute[229148]: 2025-12-01 10:25:09.110 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:25:09 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:25:09 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:25:09 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:25:09.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:25:09 np0005540826 nova_compute[229148]: 2025-12-01 10:25:09.664 229152 DEBUG nova.compute.manager [req-d9f5ee91-52f1-4563-a07e-54fb4a971757 req-2ca5881e-9c72-449d-9b74-5ca0eaa386df dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853] Received event network-vif-plugged-91698a91-5908-4580-acd7-7dd9246226da external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  1 05:25:09 np0005540826 nova_compute[229148]: 2025-12-01 10:25:09.664 229152 DEBUG oslo_concurrency.lockutils [req-d9f5ee91-52f1-4563-a07e-54fb4a971757 req-2ca5881e-9c72-449d-9b74-5ca0eaa386df dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Acquiring lock "2fd6ef3f-65f1-4b07-8cb6-cf04d7943853-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:25:09 np0005540826 nova_compute[229148]: 2025-12-01 10:25:09.665 229152 DEBUG oslo_concurrency.lockutils [req-d9f5ee91-52f1-4563-a07e-54fb4a971757 req-2ca5881e-9c72-449d-9b74-5ca0eaa386df dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Lock "2fd6ef3f-65f1-4b07-8cb6-cf04d7943853-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:25:09 np0005540826 nova_compute[229148]: 2025-12-01 10:25:09.665 229152 DEBUG oslo_concurrency.lockutils [req-d9f5ee91-52f1-4563-a07e-54fb4a971757 req-2ca5881e-9c72-449d-9b74-5ca0eaa386df dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Lock "2fd6ef3f-65f1-4b07-8cb6-cf04d7943853-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:25:09 np0005540826 nova_compute[229148]: 2025-12-01 10:25:09.665 229152 DEBUG nova.compute.manager [req-d9f5ee91-52f1-4563-a07e-54fb4a971757 req-2ca5881e-9c72-449d-9b74-5ca0eaa386df dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853] No waiting events found dispatching network-vif-plugged-91698a91-5908-4580-acd7-7dd9246226da pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  1 05:25:09 np0005540826 nova_compute[229148]: 2025-12-01 10:25:09.665 229152 WARNING nova.compute.manager [req-d9f5ee91-52f1-4563-a07e-54fb4a971757 req-2ca5881e-9c72-449d-9b74-5ca0eaa386df dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853] Received unexpected event network-vif-plugged-91698a91-5908-4580-acd7-7dd9246226da for instance with vm_state active and task_state None.#033[00m
Dec  1 05:25:09 np0005540826 nova_compute[229148]: 2025-12-01 10:25:09.954 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:25:09 np0005540826 nova_compute[229148]: 2025-12-01 10:25:09.977 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:25:10 np0005540826 nova_compute[229148]: 2025-12-01 10:25:10.676 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:25:10 np0005540826 nova_compute[229148]: 2025-12-01 10:25:10.677 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Dec  1 05:25:10 np0005540826 nova_compute[229148]: 2025-12-01 10:25:10.692 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Dec  1 05:25:10 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:25:10 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:25:10 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:25:10.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:25:11 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:25:11 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:25:11 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:25:11 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:25:11 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:25:11.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:25:11 np0005540826 nova_compute[229148]: 2025-12-01 10:25:11.765 229152 DEBUG nova.compute.manager [req-24664ed0-a76a-4912-bbe7-8526580835a9 req-bf6aeaa4-2ad7-45fa-a851-cd6b4cc7e585 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853] Received event network-changed-91698a91-5908-4580-acd7-7dd9246226da external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  1 05:25:11 np0005540826 nova_compute[229148]: 2025-12-01 10:25:11.765 229152 DEBUG nova.compute.manager [req-24664ed0-a76a-4912-bbe7-8526580835a9 req-bf6aeaa4-2ad7-45fa-a851-cd6b4cc7e585 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853] Refreshing instance network info cache due to event network-changed-91698a91-5908-4580-acd7-7dd9246226da. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  1 05:25:11 np0005540826 nova_compute[229148]: 2025-12-01 10:25:11.766 229152 DEBUG oslo_concurrency.lockutils [req-24664ed0-a76a-4912-bbe7-8526580835a9 req-bf6aeaa4-2ad7-45fa-a851-cd6b4cc7e585 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Acquiring lock "refresh_cache-2fd6ef3f-65f1-4b07-8cb6-cf04d7943853" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  1 05:25:11 np0005540826 nova_compute[229148]: 2025-12-01 10:25:11.766 229152 DEBUG oslo_concurrency.lockutils [req-24664ed0-a76a-4912-bbe7-8526580835a9 req-bf6aeaa4-2ad7-45fa-a851-cd6b4cc7e585 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Acquired lock "refresh_cache-2fd6ef3f-65f1-4b07-8cb6-cf04d7943853" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  1 05:25:11 np0005540826 nova_compute[229148]: 2025-12-01 10:25:11.766 229152 DEBUG nova.network.neutron [req-24664ed0-a76a-4912-bbe7-8526580835a9 req-bf6aeaa4-2ad7-45fa-a851-cd6b4cc7e585 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853] Refreshing network info cache for port 91698a91-5908-4580-acd7-7dd9246226da _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  1 05:25:11 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:25:12 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:25:12 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:25:12 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:25:12.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:25:13 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:25:13 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:25:13 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:25:13.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:25:14 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:25:14 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:25:14 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:25:14.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:25:14 np0005540826 nova_compute[229148]: 2025-12-01 10:25:14.956 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:25:14 np0005540826 nova_compute[229148]: 2025-12-01 10:25:14.979 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:25:15 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:25:15 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:25:15 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:25:15.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:25:15 np0005540826 nova_compute[229148]: 2025-12-01 10:25:15.716 229152 DEBUG nova.network.neutron [req-24664ed0-a76a-4912-bbe7-8526580835a9 req-bf6aeaa4-2ad7-45fa-a851-cd6b4cc7e585 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853] Updated VIF entry in instance network info cache for port 91698a91-5908-4580-acd7-7dd9246226da. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  1 05:25:15 np0005540826 nova_compute[229148]: 2025-12-01 10:25:15.717 229152 DEBUG nova.network.neutron [req-24664ed0-a76a-4912-bbe7-8526580835a9 req-bf6aeaa4-2ad7-45fa-a851-cd6b4cc7e585 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853] Updating instance_info_cache with network_info: [{"id": "91698a91-5908-4580-acd7-7dd9246226da", "address": "fa:16:3e:ca:92:34", "network": {"id": "82ec8f83-684f-44ae-8389-122bf8ed45ab", "bridge": "br-int", "label": "tempest-network-smoke--115101625", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.174", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap91698a91-59", "ovs_interfaceid": "91698a91-5908-4580-acd7-7dd9246226da", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  1 05:25:15 np0005540826 nova_compute[229148]: 2025-12-01 10:25:15.735 229152 DEBUG oslo_concurrency.lockutils [req-24664ed0-a76a-4912-bbe7-8526580835a9 req-bf6aeaa4-2ad7-45fa-a851-cd6b4cc7e585 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Releasing lock "refresh_cache-2fd6ef3f-65f1-4b07-8cb6-cf04d7943853" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  1 05:25:16 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:25:16 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:25:16 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:25:16 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:25:16.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:25:17 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:25:17 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:25:17 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:25:17.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:25:18 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:25:18 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:25:18 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:25:18.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:25:19 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:25:19 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:25:19 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:25:19.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:25:19 np0005540826 nova_compute[229148]: 2025-12-01 10:25:19.958 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:25:19 np0005540826 nova_compute[229148]: 2025-12-01 10:25:19.981 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:25:20 np0005540826 ovn_controller[132309]: 2025-12-01T10:25:20Z|00014|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ca:92:34 10.100.0.7
Dec  1 05:25:20 np0005540826 ovn_controller[132309]: 2025-12-01T10:25:20Z|00015|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ca:92:34 10.100.0.7
Dec  1 05:25:20 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:25:20 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:25:20 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:25:20.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:25:20 np0005540826 podman[242664]: 2025-12-01 10:25:20.988008123 +0000 UTC m=+0.061554683 container health_status 9ce59c2758d021c154e97f8a3178b73d8f43045c59b9ba6fd93369a760a49136 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3)
Dec  1 05:25:21 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:25:21 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:25:21 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:25:21.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:25:21 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:25:22 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:25:22 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:25:22 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:25:22.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:25:23 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:25:23 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:25:23 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:25:23.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:25:24 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:25:24 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:25:24 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:25:24.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:25:24 np0005540826 nova_compute[229148]: 2025-12-01 10:25:24.983 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  1 05:25:24 np0005540826 nova_compute[229148]: 2025-12-01 10:25:24.985 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  1 05:25:24 np0005540826 nova_compute[229148]: 2025-12-01 10:25:24.985 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Dec  1 05:25:24 np0005540826 nova_compute[229148]: 2025-12-01 10:25:24.985 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  1 05:25:25 np0005540826 nova_compute[229148]: 2025-12-01 10:25:25.002 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:25:25 np0005540826 nova_compute[229148]: 2025-12-01 10:25:25.002 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  1 05:25:25 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:25:25 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:25:25 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:25:25.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:25:26 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:25:26 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:25:26 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:25:26 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:25:26.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:25:27 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:25:27 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:25:27 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:25:27.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:25:28 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:25:28 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:25:28 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:25:28.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:25:29 np0005540826 nova_compute[229148]: 2025-12-01 10:25:29.080 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:25:29 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:25:29.080 141685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '36:10:da', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '4e:5c:35:98:90:37'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  1 05:25:29 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:25:29.082 141685 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  1 05:25:29 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:25:29 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:25:29 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:25:29.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:25:30 np0005540826 nova_compute[229148]: 2025-12-01 10:25:30.001 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:25:30 np0005540826 nova_compute[229148]: 2025-12-01 10:25:30.004 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:25:30 np0005540826 podman[242713]: 2025-12-01 10:25:30.011515017 +0000 UTC m=+0.089575301 container health_status 6b06bbb7dde72e1297fe084ecc5611549b12415c8a2a7d771b9728c6924dbf55 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec  1 05:25:30 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:25:30 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:25:30 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:25:30.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:25:31 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:25:31 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:25:31 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:25:31.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:25:31 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:25:32 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:25:32 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:25:32 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:25:32.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:25:33 np0005540826 podman[242743]: 2025-12-01 10:25:33.467535962 +0000 UTC m=+0.047045186 container health_status 2f6a34a2eb1139886d5df897b4264e2aa17883bd121a8757ac91426f6d44356f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125)
Dec  1 05:25:33 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:25:33 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:25:33 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:25:33.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:25:33 np0005540826 nova_compute[229148]: 2025-12-01 10:25:33.611 229152 DEBUG oslo_concurrency.lockutils [None req-eb66d7fc-f404-4af2-a2fc-d2bafaf1d25d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Acquiring lock "2fd6ef3f-65f1-4b07-8cb6-cf04d7943853" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:25:33 np0005540826 nova_compute[229148]: 2025-12-01 10:25:33.612 229152 DEBUG oslo_concurrency.lockutils [None req-eb66d7fc-f404-4af2-a2fc-d2bafaf1d25d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "2fd6ef3f-65f1-4b07-8cb6-cf04d7943853" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:25:33 np0005540826 nova_compute[229148]: 2025-12-01 10:25:33.612 229152 DEBUG oslo_concurrency.lockutils [None req-eb66d7fc-f404-4af2-a2fc-d2bafaf1d25d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Acquiring lock "2fd6ef3f-65f1-4b07-8cb6-cf04d7943853-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:25:33 np0005540826 nova_compute[229148]: 2025-12-01 10:25:33.612 229152 DEBUG oslo_concurrency.lockutils [None req-eb66d7fc-f404-4af2-a2fc-d2bafaf1d25d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "2fd6ef3f-65f1-4b07-8cb6-cf04d7943853-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:25:33 np0005540826 nova_compute[229148]: 2025-12-01 10:25:33.612 229152 DEBUG oslo_concurrency.lockutils [None req-eb66d7fc-f404-4af2-a2fc-d2bafaf1d25d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "2fd6ef3f-65f1-4b07-8cb6-cf04d7943853-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:25:33 np0005540826 nova_compute[229148]: 2025-12-01 10:25:33.613 229152 INFO nova.compute.manager [None req-eb66d7fc-f404-4af2-a2fc-d2bafaf1d25d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853] Terminating instance#033[00m
Dec  1 05:25:33 np0005540826 nova_compute[229148]: 2025-12-01 10:25:33.614 229152 DEBUG nova.compute.manager [None req-eb66d7fc-f404-4af2-a2fc-d2bafaf1d25d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  1 05:25:33 np0005540826 kernel: tap91698a91-59 (unregistering): left promiscuous mode
Dec  1 05:25:33 np0005540826 NetworkManager[48989]: <info>  [1764584733.6753] device (tap91698a91-59): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  1 05:25:33 np0005540826 nova_compute[229148]: 2025-12-01 10:25:33.684 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:25:33 np0005540826 ovn_controller[132309]: 2025-12-01T10:25:33Z|00101|binding|INFO|Releasing lport 91698a91-5908-4580-acd7-7dd9246226da from this chassis (sb_readonly=0)
Dec  1 05:25:33 np0005540826 ovn_controller[132309]: 2025-12-01T10:25:33Z|00102|binding|INFO|Setting lport 91698a91-5908-4580-acd7-7dd9246226da down in Southbound
Dec  1 05:25:33 np0005540826 ovn_controller[132309]: 2025-12-01T10:25:33Z|00103|binding|INFO|Removing iface tap91698a91-59 ovn-installed in OVS
Dec  1 05:25:33 np0005540826 nova_compute[229148]: 2025-12-01 10:25:33.687 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:25:33 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:25:33.696 141685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ca:92:34 10.100.0.7'], port_security=['fa:16:3e:ca:92:34 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '2fd6ef3f-65f1-4b07-8cb6-cf04d7943853', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-82ec8f83-684f-44ae-8389-122bf8ed45ab', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9f6be4e572624210b91193c011607c08', 'neutron:revision_number': '4', 'neutron:security_group_ids': '4ef8ee7a-5876-4532-b61e-5a17605fc3b6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bd915a5f-666a-4c2a-9612-6191ae438030, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe07c6626d0>], logical_port=91698a91-5908-4580-acd7-7dd9246226da) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe07c6626d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  1 05:25:33 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:25:33.697 141685 INFO neutron.agent.ovn.metadata.agent [-] Port 91698a91-5908-4580-acd7-7dd9246226da in datapath 82ec8f83-684f-44ae-8389-122bf8ed45ab unbound from our chassis#033[00m
Dec  1 05:25:33 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:25:33.698 141685 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 82ec8f83-684f-44ae-8389-122bf8ed45ab, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  1 05:25:33 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:25:33.700 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[8004fbb8-7254-4e07-b4d2-c91b87f79789]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:25:33 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:25:33.700 141685 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-82ec8f83-684f-44ae-8389-122bf8ed45ab namespace which is not needed anymore#033[00m
Dec  1 05:25:33 np0005540826 nova_compute[229148]: 2025-12-01 10:25:33.704 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:25:33 np0005540826 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d0000000c.scope: Deactivated successfully.
Dec  1 05:25:33 np0005540826 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d0000000c.scope: Consumed 13.423s CPU time.
Dec  1 05:25:33 np0005540826 systemd-machined[192474]: Machine qemu-6-instance-0000000c terminated.
Dec  1 05:25:33 np0005540826 neutron-haproxy-ovnmeta-82ec8f83-684f-44ae-8389-122bf8ed45ab[242591]: [NOTICE]   (242603) : haproxy version is 2.8.14-c23fe91
Dec  1 05:25:33 np0005540826 neutron-haproxy-ovnmeta-82ec8f83-684f-44ae-8389-122bf8ed45ab[242591]: [NOTICE]   (242603) : path to executable is /usr/sbin/haproxy
Dec  1 05:25:33 np0005540826 neutron-haproxy-ovnmeta-82ec8f83-684f-44ae-8389-122bf8ed45ab[242591]: [WARNING]  (242603) : Exiting Master process...
Dec  1 05:25:33 np0005540826 neutron-haproxy-ovnmeta-82ec8f83-684f-44ae-8389-122bf8ed45ab[242591]: [ALERT]    (242603) : Current worker (242620) exited with code 143 (Terminated)
Dec  1 05:25:33 np0005540826 neutron-haproxy-ovnmeta-82ec8f83-684f-44ae-8389-122bf8ed45ab[242591]: [WARNING]  (242603) : All workers exited. Exiting... (0)
Dec  1 05:25:33 np0005540826 nova_compute[229148]: 2025-12-01 10:25:33.836 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:25:33 np0005540826 systemd[1]: libpod-d28ffdd641d0823ef23636b3358d4250524f7e4d91089217e34acf5e3502e3c4.scope: Deactivated successfully.
Dec  1 05:25:33 np0005540826 nova_compute[229148]: 2025-12-01 10:25:33.840 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:25:33 np0005540826 podman[242786]: 2025-12-01 10:25:33.84450971 +0000 UTC m=+0.046743699 container died d28ffdd641d0823ef23636b3358d4250524f7e4d91089217e34acf5e3502e3c4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-82ec8f83-684f-44ae-8389-122bf8ed45ab, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  1 05:25:33 np0005540826 nova_compute[229148]: 2025-12-01 10:25:33.852 229152 INFO nova.virt.libvirt.driver [-] [instance: 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853] Instance destroyed successfully.#033[00m
Dec  1 05:25:33 np0005540826 nova_compute[229148]: 2025-12-01 10:25:33.854 229152 DEBUG nova.objects.instance [None req-eb66d7fc-f404-4af2-a2fc-d2bafaf1d25d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lazy-loading 'resources' on Instance uuid 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  1 05:25:33 np0005540826 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d28ffdd641d0823ef23636b3358d4250524f7e4d91089217e34acf5e3502e3c4-userdata-shm.mount: Deactivated successfully.
Dec  1 05:25:33 np0005540826 systemd[1]: var-lib-containers-storage-overlay-037a1f9e4d31fa69e772a5e3b514ac2488de21de80f69623e8492434a4ec5585-merged.mount: Deactivated successfully.
Dec  1 05:25:33 np0005540826 podman[242786]: 2025-12-01 10:25:33.882830451 +0000 UTC m=+0.085064440 container cleanup d28ffdd641d0823ef23636b3358d4250524f7e4d91089217e34acf5e3502e3c4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-82ec8f83-684f-44ae-8389-122bf8ed45ab, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec  1 05:25:33 np0005540826 systemd[1]: libpod-conmon-d28ffdd641d0823ef23636b3358d4250524f7e4d91089217e34acf5e3502e3c4.scope: Deactivated successfully.
Dec  1 05:25:33 np0005540826 nova_compute[229148]: 2025-12-01 10:25:33.908 229152 DEBUG nova.virt.libvirt.vif [None req-eb66d7fc-f404-4af2-a2fc-d2bafaf1d25d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-01T10:24:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-647821466',display_name='tempest-TestNetworkBasicOps-server-647821466',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-647821466',id=12,image_ref='8f75d6de-6ce0-44e1-b417-d0111424475b',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEbjeoaxilM6CVu2R4EOzPCQGLqTAidZHcCxX3A7PaMSz81VzCcIIzJSrd792r/+wPgmu9RKsVWEi1wOWY+i4v+Ebg5QtWRH9mDnj21+8JWacv4KnerkrFTP+ktXsozcmA==',key_name='tempest-TestNetworkBasicOps-1274436416',keypairs=<?>,launch_index=0,launched_at=2025-12-01T10:25:07Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='9f6be4e572624210b91193c011607c08',ramdisk_id='',reservation_id='r-ldupxg6a',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='8f75d6de-6ce0-44e1-b417-d0111424475b',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1248115384',owner_user_name='tempest-TestNetworkBasicOps-1248115384-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-01T10:25:07Z,user_data=None,user_id='5b56a238daf0445798410e51caada0ff',uuid=2fd6ef3f-65f1-4b07-8cb6-cf04d7943853,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "91698a91-5908-4580-acd7-7dd9246226da", "address": "fa:16:3e:ca:92:34", "network": {"id": "82ec8f83-684f-44ae-8389-122bf8ed45ab", "bridge": "br-int", "label": "tempest-network-smoke--115101625", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.174", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap91698a91-59", "ovs_interfaceid": "91698a91-5908-4580-acd7-7dd9246226da", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  1 05:25:33 np0005540826 nova_compute[229148]: 2025-12-01 10:25:33.908 229152 DEBUG nova.network.os_vif_util [None req-eb66d7fc-f404-4af2-a2fc-d2bafaf1d25d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Converting VIF {"id": "91698a91-5908-4580-acd7-7dd9246226da", "address": "fa:16:3e:ca:92:34", "network": {"id": "82ec8f83-684f-44ae-8389-122bf8ed45ab", "bridge": "br-int", "label": "tempest-network-smoke--115101625", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.174", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap91698a91-59", "ovs_interfaceid": "91698a91-5908-4580-acd7-7dd9246226da", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  1 05:25:33 np0005540826 nova_compute[229148]: 2025-12-01 10:25:33.909 229152 DEBUG nova.network.os_vif_util [None req-eb66d7fc-f404-4af2-a2fc-d2bafaf1d25d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ca:92:34,bridge_name='br-int',has_traffic_filtering=True,id=91698a91-5908-4580-acd7-7dd9246226da,network=Network(82ec8f83-684f-44ae-8389-122bf8ed45ab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap91698a91-59') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  1 05:25:33 np0005540826 nova_compute[229148]: 2025-12-01 10:25:33.909 229152 DEBUG os_vif [None req-eb66d7fc-f404-4af2-a2fc-d2bafaf1d25d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ca:92:34,bridge_name='br-int',has_traffic_filtering=True,id=91698a91-5908-4580-acd7-7dd9246226da,network=Network(82ec8f83-684f-44ae-8389-122bf8ed45ab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap91698a91-59') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  1 05:25:33 np0005540826 nova_compute[229148]: 2025-12-01 10:25:33.911 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:25:33 np0005540826 nova_compute[229148]: 2025-12-01 10:25:33.911 229152 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap91698a91-59, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  1 05:25:33 np0005540826 nova_compute[229148]: 2025-12-01 10:25:33.912 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:25:33 np0005540826 nova_compute[229148]: 2025-12-01 10:25:33.915 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  1 05:25:33 np0005540826 nova_compute[229148]: 2025-12-01 10:25:33.917 229152 INFO os_vif [None req-eb66d7fc-f404-4af2-a2fc-d2bafaf1d25d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ca:92:34,bridge_name='br-int',has_traffic_filtering=True,id=91698a91-5908-4580-acd7-7dd9246226da,network=Network(82ec8f83-684f-44ae-8389-122bf8ed45ab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap91698a91-59')#033[00m
Dec  1 05:25:33 np0005540826 podman[242822]: 2025-12-01 10:25:33.95037265 +0000 UTC m=+0.043367966 container remove d28ffdd641d0823ef23636b3358d4250524f7e4d91089217e34acf5e3502e3c4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-82ec8f83-684f-44ae-8389-122bf8ed45ab, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec  1 05:25:33 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:25:33.959 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[21d5c259-a045-4e5f-838b-295fc46974c1]: (4, ('Mon Dec  1 10:25:33 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-82ec8f83-684f-44ae-8389-122bf8ed45ab (d28ffdd641d0823ef23636b3358d4250524f7e4d91089217e34acf5e3502e3c4)\nd28ffdd641d0823ef23636b3358d4250524f7e4d91089217e34acf5e3502e3c4\nMon Dec  1 10:25:33 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-82ec8f83-684f-44ae-8389-122bf8ed45ab (d28ffdd641d0823ef23636b3358d4250524f7e4d91089217e34acf5e3502e3c4)\nd28ffdd641d0823ef23636b3358d4250524f7e4d91089217e34acf5e3502e3c4\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:25:33 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:25:33.961 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[6850b626-2eca-46c2-ab69-569169b6afc2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:25:33 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:25:33.962 141685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap82ec8f83-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  1 05:25:33 np0005540826 nova_compute[229148]: 2025-12-01 10:25:33.963 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:25:33 np0005540826 kernel: tap82ec8f83-60: left promiscuous mode
Dec  1 05:25:33 np0005540826 nova_compute[229148]: 2025-12-01 10:25:33.976 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:25:33 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:25:33.979 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[27e80807-ca8b-45fb-99f3-e6e591c30c18]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:25:33 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:25:33.995 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[9c072957-c9f6-456b-bb5b-6bb4d98b021f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:25:33 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:25:33.996 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[e538e75f-6daf-440f-be14-4e50b6ecf6b7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:25:34 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:25:34.012 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[be957829-e804-409a-bd14-70bc6a2f20b7]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 456833, 'reachable_time': 31583, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 242855, 'error': None, 'target': 'ovnmeta-82ec8f83-684f-44ae-8389-122bf8ed45ab', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:25:34 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:25:34.016 141797 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-82ec8f83-684f-44ae-8389-122bf8ed45ab deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec  1 05:25:34 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:25:34.016 141797 DEBUG oslo.privsep.daemon [-] privsep: reply[a8fc59dd-6ee3-48d1-9004-6c36f4738725]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:25:34 np0005540826 systemd[1]: run-netns-ovnmeta\x2d82ec8f83\x2d684f\x2d44ae\x2d8389\x2d122bf8ed45ab.mount: Deactivated successfully.
Dec  1 05:25:34 np0005540826 nova_compute[229148]: 2025-12-01 10:25:34.299 229152 DEBUG nova.compute.manager [req-4bb2673a-9a11-4080-a142-18c5ea9876e6 req-6979913c-b79d-41cf-8470-a21a008bd407 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853] Received event network-changed-91698a91-5908-4580-acd7-7dd9246226da external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  1 05:25:34 np0005540826 nova_compute[229148]: 2025-12-01 10:25:34.299 229152 DEBUG nova.compute.manager [req-4bb2673a-9a11-4080-a142-18c5ea9876e6 req-6979913c-b79d-41cf-8470-a21a008bd407 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853] Refreshing instance network info cache due to event network-changed-91698a91-5908-4580-acd7-7dd9246226da. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  1 05:25:34 np0005540826 nova_compute[229148]: 2025-12-01 10:25:34.299 229152 DEBUG oslo_concurrency.lockutils [req-4bb2673a-9a11-4080-a142-18c5ea9876e6 req-6979913c-b79d-41cf-8470-a21a008bd407 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Acquiring lock "refresh_cache-2fd6ef3f-65f1-4b07-8cb6-cf04d7943853" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  1 05:25:34 np0005540826 nova_compute[229148]: 2025-12-01 10:25:34.300 229152 DEBUG oslo_concurrency.lockutils [req-4bb2673a-9a11-4080-a142-18c5ea9876e6 req-6979913c-b79d-41cf-8470-a21a008bd407 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Acquired lock "refresh_cache-2fd6ef3f-65f1-4b07-8cb6-cf04d7943853" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  1 05:25:34 np0005540826 nova_compute[229148]: 2025-12-01 10:25:34.300 229152 DEBUG nova.network.neutron [req-4bb2673a-9a11-4080-a142-18c5ea9876e6 req-6979913c-b79d-41cf-8470-a21a008bd407 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853] Refreshing network info cache for port 91698a91-5908-4580-acd7-7dd9246226da _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  1 05:25:34 np0005540826 nova_compute[229148]: 2025-12-01 10:25:34.317 229152 INFO nova.virt.libvirt.driver [None req-eb66d7fc-f404-4af2-a2fc-d2bafaf1d25d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853] Deleting instance files /var/lib/nova/instances/2fd6ef3f-65f1-4b07-8cb6-cf04d7943853_del#033[00m
Dec  1 05:25:34 np0005540826 nova_compute[229148]: 2025-12-01 10:25:34.318 229152 INFO nova.virt.libvirt.driver [None req-eb66d7fc-f404-4af2-a2fc-d2bafaf1d25d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853] Deletion of /var/lib/nova/instances/2fd6ef3f-65f1-4b07-8cb6-cf04d7943853_del complete#033[00m
Dec  1 05:25:34 np0005540826 nova_compute[229148]: 2025-12-01 10:25:34.374 229152 INFO nova.compute.manager [None req-eb66d7fc-f404-4af2-a2fc-d2bafaf1d25d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853] Took 0.76 seconds to destroy the instance on the hypervisor.#033[00m
Dec  1 05:25:34 np0005540826 nova_compute[229148]: 2025-12-01 10:25:34.375 229152 DEBUG oslo.service.loopingcall [None req-eb66d7fc-f404-4af2-a2fc-d2bafaf1d25d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  1 05:25:34 np0005540826 nova_compute[229148]: 2025-12-01 10:25:34.375 229152 DEBUG nova.compute.manager [-] [instance: 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  1 05:25:34 np0005540826 nova_compute[229148]: 2025-12-01 10:25:34.375 229152 DEBUG nova.network.neutron [-] [instance: 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  1 05:25:34 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:25:34 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:25:34 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:25:34.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:25:35 np0005540826 nova_compute[229148]: 2025-12-01 10:25:35.007 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:25:35 np0005540826 nova_compute[229148]: 2025-12-01 10:25:35.463 229152 DEBUG nova.network.neutron [-] [instance: 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  1 05:25:35 np0005540826 nova_compute[229148]: 2025-12-01 10:25:35.487 229152 INFO nova.compute.manager [-] [instance: 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853] Took 1.11 seconds to deallocate network for instance.#033[00m
Dec  1 05:25:35 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:25:35 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:25:35 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:25:35.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:25:35 np0005540826 nova_compute[229148]: 2025-12-01 10:25:35.691 229152 DEBUG oslo_concurrency.lockutils [None req-eb66d7fc-f404-4af2-a2fc-d2bafaf1d25d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:25:35 np0005540826 nova_compute[229148]: 2025-12-01 10:25:35.691 229152 DEBUG oslo_concurrency.lockutils [None req-eb66d7fc-f404-4af2-a2fc-d2bafaf1d25d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:25:35 np0005540826 nova_compute[229148]: 2025-12-01 10:25:35.746 229152 DEBUG oslo_concurrency.processutils [None req-eb66d7fc-f404-4af2-a2fc-d2bafaf1d25d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:25:36 np0005540826 nova_compute[229148]: 2025-12-01 10:25:36.059 229152 DEBUG nova.network.neutron [req-4bb2673a-9a11-4080-a142-18c5ea9876e6 req-6979913c-b79d-41cf-8470-a21a008bd407 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853] Updated VIF entry in instance network info cache for port 91698a91-5908-4580-acd7-7dd9246226da. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  1 05:25:36 np0005540826 nova_compute[229148]: 2025-12-01 10:25:36.060 229152 DEBUG nova.network.neutron [req-4bb2673a-9a11-4080-a142-18c5ea9876e6 req-6979913c-b79d-41cf-8470-a21a008bd407 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853] Updating instance_info_cache with network_info: [{"id": "91698a91-5908-4580-acd7-7dd9246226da", "address": "fa:16:3e:ca:92:34", "network": {"id": "82ec8f83-684f-44ae-8389-122bf8ed45ab", "bridge": "br-int", "label": "tempest-network-smoke--115101625", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap91698a91-59", "ovs_interfaceid": "91698a91-5908-4580-acd7-7dd9246226da", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  1 05:25:36 np0005540826 nova_compute[229148]: 2025-12-01 10:25:36.096 229152 DEBUG oslo_concurrency.lockutils [req-4bb2673a-9a11-4080-a142-18c5ea9876e6 req-6979913c-b79d-41cf-8470-a21a008bd407 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Releasing lock "refresh_cache-2fd6ef3f-65f1-4b07-8cb6-cf04d7943853" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  1 05:25:36 np0005540826 nova_compute[229148]: 2025-12-01 10:25:36.097 229152 DEBUG nova.compute.manager [req-4bb2673a-9a11-4080-a142-18c5ea9876e6 req-6979913c-b79d-41cf-8470-a21a008bd407 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853] Received event network-vif-unplugged-91698a91-5908-4580-acd7-7dd9246226da external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  1 05:25:36 np0005540826 nova_compute[229148]: 2025-12-01 10:25:36.098 229152 DEBUG oslo_concurrency.lockutils [req-4bb2673a-9a11-4080-a142-18c5ea9876e6 req-6979913c-b79d-41cf-8470-a21a008bd407 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Acquiring lock "2fd6ef3f-65f1-4b07-8cb6-cf04d7943853-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:25:36 np0005540826 nova_compute[229148]: 2025-12-01 10:25:36.098 229152 DEBUG oslo_concurrency.lockutils [req-4bb2673a-9a11-4080-a142-18c5ea9876e6 req-6979913c-b79d-41cf-8470-a21a008bd407 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Lock "2fd6ef3f-65f1-4b07-8cb6-cf04d7943853-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:25:36 np0005540826 nova_compute[229148]: 2025-12-01 10:25:36.098 229152 DEBUG oslo_concurrency.lockutils [req-4bb2673a-9a11-4080-a142-18c5ea9876e6 req-6979913c-b79d-41cf-8470-a21a008bd407 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Lock "2fd6ef3f-65f1-4b07-8cb6-cf04d7943853-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:25:36 np0005540826 nova_compute[229148]: 2025-12-01 10:25:36.098 229152 DEBUG nova.compute.manager [req-4bb2673a-9a11-4080-a142-18c5ea9876e6 req-6979913c-b79d-41cf-8470-a21a008bd407 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853] No waiting events found dispatching network-vif-unplugged-91698a91-5908-4580-acd7-7dd9246226da pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  1 05:25:36 np0005540826 nova_compute[229148]: 2025-12-01 10:25:36.099 229152 DEBUG nova.compute.manager [req-4bb2673a-9a11-4080-a142-18c5ea9876e6 req-6979913c-b79d-41cf-8470-a21a008bd407 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853] Received event network-vif-unplugged-91698a91-5908-4580-acd7-7dd9246226da for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec  1 05:25:36 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:25:36 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4000332064' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:25:36 np0005540826 nova_compute[229148]: 2025-12-01 10:25:36.189 229152 DEBUG oslo_concurrency.processutils [None req-eb66d7fc-f404-4af2-a2fc-d2bafaf1d25d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.443s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:25:36 np0005540826 nova_compute[229148]: 2025-12-01 10:25:36.195 229152 DEBUG nova.compute.provider_tree [None req-eb66d7fc-f404-4af2-a2fc-d2bafaf1d25d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Inventory has not changed in ProviderTree for provider: 19014d04-db84-4f3d-831b-084720e9168c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  1 05:25:36 np0005540826 nova_compute[229148]: 2025-12-01 10:25:36.210 229152 DEBUG nova.scheduler.client.report [None req-eb66d7fc-f404-4af2-a2fc-d2bafaf1d25d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Inventory has not changed for provider 19014d04-db84-4f3d-831b-084720e9168c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  1 05:25:36 np0005540826 nova_compute[229148]: 2025-12-01 10:25:36.237 229152 DEBUG oslo_concurrency.lockutils [None req-eb66d7fc-f404-4af2-a2fc-d2bafaf1d25d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.545s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:25:36 np0005540826 nova_compute[229148]: 2025-12-01 10:25:36.288 229152 INFO nova.scheduler.client.report [None req-eb66d7fc-f404-4af2-a2fc-d2bafaf1d25d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Deleted allocations for instance 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853#033[00m
Dec  1 05:25:36 np0005540826 nova_compute[229148]: 2025-12-01 10:25:36.387 229152 DEBUG oslo_concurrency.lockutils [None req-eb66d7fc-f404-4af2-a2fc-d2bafaf1d25d 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "2fd6ef3f-65f1-4b07-8cb6-cf04d7943853" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.776s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:25:36 np0005540826 nova_compute[229148]: 2025-12-01 10:25:36.410 229152 DEBUG nova.compute.manager [req-d1c2f5a0-5eff-4f59-af72-eda246c9fb8b req-875ef7eb-8794-4ceb-aff4-bf821a5a1cc3 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853] Received event network-vif-plugged-91698a91-5908-4580-acd7-7dd9246226da external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  1 05:25:36 np0005540826 nova_compute[229148]: 2025-12-01 10:25:36.411 229152 DEBUG oslo_concurrency.lockutils [req-d1c2f5a0-5eff-4f59-af72-eda246c9fb8b req-875ef7eb-8794-4ceb-aff4-bf821a5a1cc3 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Acquiring lock "2fd6ef3f-65f1-4b07-8cb6-cf04d7943853-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:25:36 np0005540826 nova_compute[229148]: 2025-12-01 10:25:36.411 229152 DEBUG oslo_concurrency.lockutils [req-d1c2f5a0-5eff-4f59-af72-eda246c9fb8b req-875ef7eb-8794-4ceb-aff4-bf821a5a1cc3 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Lock "2fd6ef3f-65f1-4b07-8cb6-cf04d7943853-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:25:36 np0005540826 nova_compute[229148]: 2025-12-01 10:25:36.411 229152 DEBUG oslo_concurrency.lockutils [req-d1c2f5a0-5eff-4f59-af72-eda246c9fb8b req-875ef7eb-8794-4ceb-aff4-bf821a5a1cc3 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Lock "2fd6ef3f-65f1-4b07-8cb6-cf04d7943853-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:25:36 np0005540826 nova_compute[229148]: 2025-12-01 10:25:36.412 229152 DEBUG nova.compute.manager [req-d1c2f5a0-5eff-4f59-af72-eda246c9fb8b req-875ef7eb-8794-4ceb-aff4-bf821a5a1cc3 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853] No waiting events found dispatching network-vif-plugged-91698a91-5908-4580-acd7-7dd9246226da pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  1 05:25:36 np0005540826 nova_compute[229148]: 2025-12-01 10:25:36.412 229152 WARNING nova.compute.manager [req-d1c2f5a0-5eff-4f59-af72-eda246c9fb8b req-875ef7eb-8794-4ceb-aff4-bf821a5a1cc3 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853] Received unexpected event network-vif-plugged-91698a91-5908-4580-acd7-7dd9246226da for instance with vm_state deleted and task_state None.#033[00m
Dec  1 05:25:36 np0005540826 nova_compute[229148]: 2025-12-01 10:25:36.412 229152 DEBUG nova.compute.manager [req-d1c2f5a0-5eff-4f59-af72-eda246c9fb8b req-875ef7eb-8794-4ceb-aff4-bf821a5a1cc3 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853] Received event network-vif-deleted-91698a91-5908-4580-acd7-7dd9246226da external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  1 05:25:36 np0005540826 nova_compute[229148]: 2025-12-01 10:25:36.413 229152 INFO nova.compute.manager [req-d1c2f5a0-5eff-4f59-af72-eda246c9fb8b req-875ef7eb-8794-4ceb-aff4-bf821a5a1cc3 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853] Neutron deleted interface 91698a91-5908-4580-acd7-7dd9246226da; detaching it from the instance and deleting it from the info cache#033[00m
Dec  1 05:25:36 np0005540826 nova_compute[229148]: 2025-12-01 10:25:36.413 229152 DEBUG nova.network.neutron [req-d1c2f5a0-5eff-4f59-af72-eda246c9fb8b req-875ef7eb-8794-4ceb-aff4-bf821a5a1cc3 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853] Instance is deleted, no further info cache update update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:106#033[00m
Dec  1 05:25:36 np0005540826 nova_compute[229148]: 2025-12-01 10:25:36.416 229152 DEBUG nova.compute.manager [req-d1c2f5a0-5eff-4f59-af72-eda246c9fb8b req-875ef7eb-8794-4ceb-aff4-bf821a5a1cc3 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853] Detach interface failed, port_id=91698a91-5908-4580-acd7-7dd9246226da, reason: Instance 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Dec  1 05:25:36 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:25:36 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:25:36 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:25:36 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:25:36.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:25:37 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:25:37.084 141685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=b99910e3-15ec-4cc7-b887-f5229f22d165, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  1 05:25:37 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:25:37 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:25:37 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:25:37.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:25:38 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:25:38 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:25:38 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:25:38.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:25:38 np0005540826 nova_compute[229148]: 2025-12-01 10:25:38.914 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:25:39 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:25:39 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:25:39 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:25:39.597 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:25:40 np0005540826 nova_compute[229148]: 2025-12-01 10:25:40.009 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:25:40 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:25:40 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:25:40 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:25:40.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:25:41 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:25:41 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:25:41 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:25:41.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:25:41 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:25:42 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:25:42 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:25:42 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:25:42.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:25:43 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:25:43 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:25:43 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:25:43.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:25:43 np0005540826 nova_compute[229148]: 2025-12-01 10:25:43.919 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:25:44 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:25:44 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:25:44 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:25:44.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:25:45 np0005540826 nova_compute[229148]: 2025-12-01 10:25:45.009 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:25:45 np0005540826 nova_compute[229148]: 2025-12-01 10:25:45.581 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:25:45 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:25:45 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:25:45 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:25:45.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:25:45 np0005540826 nova_compute[229148]: 2025-12-01 10:25:45.653 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:25:46 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:25:46 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:25:46 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:25:46 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:25:46.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:25:47 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:25:47 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:25:47 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:25:47.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:25:48 np0005540826 nova_compute[229148]: 2025-12-01 10:25:48.852 229152 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764584733.849906, 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  1 05:25:48 np0005540826 nova_compute[229148]: 2025-12-01 10:25:48.852 229152 INFO nova.compute.manager [-] [instance: 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853] VM Stopped (Lifecycle Event)#033[00m
Dec  1 05:25:48 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:25:48 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:25:48 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:25:48.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:25:48 np0005540826 nova_compute[229148]: 2025-12-01 10:25:48.921 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:25:48 np0005540826 nova_compute[229148]: 2025-12-01 10:25:48.944 229152 DEBUG nova.compute.manager [None req-57247bdc-8021-4c1d-9bab-96c2a298c432 - - - - - -] [instance: 2fd6ef3f-65f1-4b07-8cb6-cf04d7943853] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  1 05:25:49 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:25:49 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:25:49 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:25:49.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:25:50 np0005540826 nova_compute[229148]: 2025-12-01 10:25:50.011 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:25:50 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:25:50 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:25:50 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:25:50.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:25:51 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:25:51 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:25:51 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:25:51.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:25:51 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:25:51 np0005540826 podman[242914]: 2025-12-01 10:25:51.985041768 +0000 UTC m=+0.064007213 container health_status 9ce59c2758d021c154e97f8a3178b73d8f43045c59b9ba6fd93369a760a49136 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  1 05:25:52 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:25:52 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:25:52 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:25:52.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:25:53 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:25:53 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:25:53 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:25:53.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:25:53 np0005540826 nova_compute[229148]: 2025-12-01 10:25:53.924 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:25:54 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:25:54 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:25:54 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:25:54.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:25:55 np0005540826 nova_compute[229148]: 2025-12-01 10:25:55.063 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:25:55 np0005540826 nova_compute[229148]: 2025-12-01 10:25:55.126 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:25:55 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:25:55 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:25:55 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:25:55.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:25:56 np0005540826 nova_compute[229148]: 2025-12-01 10:25:56.105 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:25:56 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:25:56 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:25:56 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:25:56 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:25:56.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:25:57 np0005540826 nova_compute[229148]: 2025-12-01 10:25:57.110 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:25:57 np0005540826 nova_compute[229148]: 2025-12-01 10:25:57.110 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  1 05:25:57 np0005540826 nova_compute[229148]: 2025-12-01 10:25:57.110 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  1 05:25:57 np0005540826 nova_compute[229148]: 2025-12-01 10:25:57.132 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  1 05:25:57 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:25:57 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:25:57 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:25:57.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:25:58 np0005540826 nova_compute[229148]: 2025-12-01 10:25:58.109 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:25:58 np0005540826 nova_compute[229148]: 2025-12-01 10:25:58.928 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:25:58 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:25:58 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:25:58 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:25:58.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:25:59 np0005540826 nova_compute[229148]: 2025-12-01 10:25:59.109 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:25:59 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:25:59 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:25:59 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:25:59.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:26:00 np0005540826 nova_compute[229148]: 2025-12-01 10:26:00.065 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:26:00 np0005540826 nova_compute[229148]: 2025-12-01 10:26:00.110 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:26:00 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:26:00 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:26:00 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:26:00.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:26:01 np0005540826 podman[242940]: 2025-12-01 10:26:01.01794956 +0000 UTC m=+0.095194548 container health_status 6b06bbb7dde72e1297fe084ecc5611549b12415c8a2a7d771b9728c6924dbf55 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Dec  1 05:26:01 np0005540826 nova_compute[229148]: 2025-12-01 10:26:01.109 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:26:01 np0005540826 nova_compute[229148]: 2025-12-01 10:26:01.110 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  1 05:26:01 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:26:01 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:26:01 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:26:01.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:26:01 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:26:02 np0005540826 nova_compute[229148]: 2025-12-01 10:26:02.110 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:26:02 np0005540826 nova_compute[229148]: 2025-12-01 10:26:02.139 229152 DEBUG oslo_concurrency.lockutils [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Acquiring lock "3540e47c-6e86-4a7a-8843-8d7f1a7b01f6" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:26:02 np0005540826 nova_compute[229148]: 2025-12-01 10:26:02.140 229152 DEBUG oslo_concurrency.lockutils [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "3540e47c-6e86-4a7a-8843-8d7f1a7b01f6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:26:02 np0005540826 nova_compute[229148]: 2025-12-01 10:26:02.160 229152 DEBUG nova.compute.manager [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  1 05:26:02 np0005540826 nova_compute[229148]: 2025-12-01 10:26:02.251 229152 DEBUG oslo_concurrency.lockutils [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:26:02 np0005540826 nova_compute[229148]: 2025-12-01 10:26:02.251 229152 DEBUG oslo_concurrency.lockutils [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:26:02 np0005540826 nova_compute[229148]: 2025-12-01 10:26:02.257 229152 DEBUG nova.virt.hardware [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  1 05:26:02 np0005540826 nova_compute[229148]: 2025-12-01 10:26:02.257 229152 INFO nova.compute.claims [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6] Claim successful on node compute-1.ctlplane.example.com#033[00m
Dec  1 05:26:02 np0005540826 nova_compute[229148]: 2025-12-01 10:26:02.386 229152 DEBUG oslo_concurrency.processutils [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:26:02 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:26:02 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2558204076' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:26:02 np0005540826 nova_compute[229148]: 2025-12-01 10:26:02.840 229152 DEBUG oslo_concurrency.processutils [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:26:02 np0005540826 nova_compute[229148]: 2025-12-01 10:26:02.848 229152 DEBUG nova.compute.provider_tree [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Inventory has not changed in ProviderTree for provider: 19014d04-db84-4f3d-831b-084720e9168c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  1 05:26:02 np0005540826 nova_compute[229148]: 2025-12-01 10:26:02.865 229152 DEBUG nova.scheduler.client.report [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Inventory has not changed for provider 19014d04-db84-4f3d-831b-084720e9168c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  1 05:26:02 np0005540826 nova_compute[229148]: 2025-12-01 10:26:02.885 229152 DEBUG oslo_concurrency.lockutils [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.634s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:26:02 np0005540826 nova_compute[229148]: 2025-12-01 10:26:02.886 229152 DEBUG nova.compute.manager [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  1 05:26:02 np0005540826 nova_compute[229148]: 2025-12-01 10:26:02.926 229152 DEBUG nova.compute.manager [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  1 05:26:02 np0005540826 nova_compute[229148]: 2025-12-01 10:26:02.926 229152 DEBUG nova.network.neutron [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  1 05:26:02 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:26:02 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:26:02 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:26:02.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:26:02 np0005540826 nova_compute[229148]: 2025-12-01 10:26:02.946 229152 INFO nova.virt.libvirt.driver [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  1 05:26:02 np0005540826 nova_compute[229148]: 2025-12-01 10:26:02.962 229152 DEBUG nova.compute.manager [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  1 05:26:03 np0005540826 nova_compute[229148]: 2025-12-01 10:26:03.065 229152 DEBUG nova.compute.manager [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  1 05:26:03 np0005540826 nova_compute[229148]: 2025-12-01 10:26:03.066 229152 DEBUG nova.virt.libvirt.driver [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  1 05:26:03 np0005540826 nova_compute[229148]: 2025-12-01 10:26:03.067 229152 INFO nova.virt.libvirt.driver [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6] Creating image(s)#033[00m
Dec  1 05:26:03 np0005540826 nova_compute[229148]: 2025-12-01 10:26:03.092 229152 DEBUG nova.storage.rbd_utils [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] rbd image 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Dec  1 05:26:03 np0005540826 nova_compute[229148]: 2025-12-01 10:26:03.121 229152 DEBUG nova.storage.rbd_utils [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] rbd image 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Dec  1 05:26:03 np0005540826 nova_compute[229148]: 2025-12-01 10:26:03.145 229152 DEBUG nova.storage.rbd_utils [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] rbd image 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Dec  1 05:26:03 np0005540826 nova_compute[229148]: 2025-12-01 10:26:03.149 229152 DEBUG oslo_concurrency.processutils [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/caad95fa2cc8ed03bed2e9851744954b07ec7b34 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:26:03 np0005540826 nova_compute[229148]: 2025-12-01 10:26:03.170 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:26:03 np0005540826 nova_compute[229148]: 2025-12-01 10:26:03.177 229152 DEBUG nova.policy [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5b56a238daf0445798410e51caada0ff', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9f6be4e572624210b91193c011607c08', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  1 05:26:03 np0005540826 nova_compute[229148]: 2025-12-01 10:26:03.212 229152 DEBUG oslo_concurrency.processutils [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/caad95fa2cc8ed03bed2e9851744954b07ec7b34 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:26:03 np0005540826 nova_compute[229148]: 2025-12-01 10:26:03.213 229152 DEBUG oslo_concurrency.lockutils [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Acquiring lock "caad95fa2cc8ed03bed2e9851744954b07ec7b34" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:26:03 np0005540826 nova_compute[229148]: 2025-12-01 10:26:03.214 229152 DEBUG oslo_concurrency.lockutils [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "caad95fa2cc8ed03bed2e9851744954b07ec7b34" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:26:03 np0005540826 nova_compute[229148]: 2025-12-01 10:26:03.214 229152 DEBUG oslo_concurrency.lockutils [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "caad95fa2cc8ed03bed2e9851744954b07ec7b34" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:26:03 np0005540826 nova_compute[229148]: 2025-12-01 10:26:03.254 229152 DEBUG nova.storage.rbd_utils [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] rbd image 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Dec  1 05:26:03 np0005540826 nova_compute[229148]: 2025-12-01 10:26:03.258 229152 DEBUG oslo_concurrency.processutils [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/caad95fa2cc8ed03bed2e9851744954b07ec7b34 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:26:03 np0005540826 nova_compute[229148]: 2025-12-01 10:26:03.584 229152 DEBUG oslo_concurrency.processutils [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/caad95fa2cc8ed03bed2e9851744954b07ec7b34 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.326s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:26:03 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:26:03 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:26:03 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:26:03.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:26:03 np0005540826 nova_compute[229148]: 2025-12-01 10:26:03.654 229152 DEBUG nova.storage.rbd_utils [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] resizing rbd image 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Dec  1 05:26:03 np0005540826 nova_compute[229148]: 2025-12-01 10:26:03.764 229152 DEBUG nova.objects.instance [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lazy-loading 'migration_context' on Instance uuid 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  1 05:26:03 np0005540826 nova_compute[229148]: 2025-12-01 10:26:03.788 229152 DEBUG nova.virt.libvirt.driver [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  1 05:26:03 np0005540826 nova_compute[229148]: 2025-12-01 10:26:03.789 229152 DEBUG nova.virt.libvirt.driver [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6] Ensure instance console log exists: /var/lib/nova/instances/3540e47c-6e86-4a7a-8843-8d7f1a7b01f6/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  1 05:26:03 np0005540826 nova_compute[229148]: 2025-12-01 10:26:03.789 229152 DEBUG oslo_concurrency.lockutils [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:26:03 np0005540826 nova_compute[229148]: 2025-12-01 10:26:03.790 229152 DEBUG oslo_concurrency.lockutils [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:26:03 np0005540826 nova_compute[229148]: 2025-12-01 10:26:03.790 229152 DEBUG oslo_concurrency.lockutils [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:26:03 np0005540826 nova_compute[229148]: 2025-12-01 10:26:03.931 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:26:03 np0005540826 podman[243156]: 2025-12-01 10:26:03.966287456 +0000 UTC m=+0.053811627 container health_status 2f6a34a2eb1139886d5df897b4264e2aa17883bd121a8757ac91426f6d44356f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec  1 05:26:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:26:04.559 141685 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:26:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:26:04.559 141685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:26:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:26:04.559 141685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:26:04 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:26:04 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:26:04 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:26:04.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:26:05 np0005540826 nova_compute[229148]: 2025-12-01 10:26:05.069 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:26:05 np0005540826 nova_compute[229148]: 2025-12-01 10:26:05.109 229152 DEBUG nova.network.neutron [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6] Successfully created port: fdc3dac2-b9d1-4468-8307-a272a6efe638 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec  1 05:26:05 np0005540826 nova_compute[229148]: 2025-12-01 10:26:05.112 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:26:05 np0005540826 nova_compute[229148]: 2025-12-01 10:26:05.134 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:26:05 np0005540826 nova_compute[229148]: 2025-12-01 10:26:05.134 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:26:05 np0005540826 nova_compute[229148]: 2025-12-01 10:26:05.135 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:26:05 np0005540826 nova_compute[229148]: 2025-12-01 10:26:05.135 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  1 05:26:05 np0005540826 nova_compute[229148]: 2025-12-01 10:26:05.135 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:26:05 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:26:05 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/866047266' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:26:05 np0005540826 nova_compute[229148]: 2025-12-01 10:26:05.586 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.450s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:26:05 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:26:05 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:26:05 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:26:05.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:26:05 np0005540826 nova_compute[229148]: 2025-12-01 10:26:05.745 229152 WARNING nova.virt.libvirt.driver [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  1 05:26:05 np0005540826 nova_compute[229148]: 2025-12-01 10:26:05.747 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4913MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  1 05:26:05 np0005540826 nova_compute[229148]: 2025-12-01 10:26:05.747 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:26:05 np0005540826 nova_compute[229148]: 2025-12-01 10:26:05.747 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:26:05 np0005540826 nova_compute[229148]: 2025-12-01 10:26:05.805 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Instance 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  1 05:26:05 np0005540826 nova_compute[229148]: 2025-12-01 10:26:05.806 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  1 05:26:05 np0005540826 nova_compute[229148]: 2025-12-01 10:26:05.806 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=59GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  1 05:26:05 np0005540826 nova_compute[229148]: 2025-12-01 10:26:05.855 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:26:06 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:26:06 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3871086598' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:26:06 np0005540826 nova_compute[229148]: 2025-12-01 10:26:06.329 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.474s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:26:06 np0005540826 nova_compute[229148]: 2025-12-01 10:26:06.335 229152 DEBUG nova.compute.provider_tree [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Inventory has not changed in ProviderTree for provider: 19014d04-db84-4f3d-831b-084720e9168c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  1 05:26:06 np0005540826 nova_compute[229148]: 2025-12-01 10:26:06.380 229152 DEBUG nova.scheduler.client.report [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Inventory has not changed for provider 19014d04-db84-4f3d-831b-084720e9168c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  1 05:26:06 np0005540826 nova_compute[229148]: 2025-12-01 10:26:06.407 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  1 05:26:06 np0005540826 nova_compute[229148]: 2025-12-01 10:26:06.408 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.661s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:26:06 np0005540826 nova_compute[229148]: 2025-12-01 10:26:06.457 229152 DEBUG nova.network.neutron [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6] Successfully updated port: fdc3dac2-b9d1-4468-8307-a272a6efe638 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  1 05:26:06 np0005540826 nova_compute[229148]: 2025-12-01 10:26:06.473 229152 DEBUG oslo_concurrency.lockutils [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Acquiring lock "refresh_cache-3540e47c-6e86-4a7a-8843-8d7f1a7b01f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  1 05:26:06 np0005540826 nova_compute[229148]: 2025-12-01 10:26:06.474 229152 DEBUG oslo_concurrency.lockutils [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Acquired lock "refresh_cache-3540e47c-6e86-4a7a-8843-8d7f1a7b01f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  1 05:26:06 np0005540826 nova_compute[229148]: 2025-12-01 10:26:06.474 229152 DEBUG nova.network.neutron [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  1 05:26:06 np0005540826 nova_compute[229148]: 2025-12-01 10:26:06.549 229152 DEBUG nova.compute.manager [req-fa87d423-0dc2-4bc6-92b7-93b971787e11 req-93d0d32d-51ba-431e-a911-93f598d0ad8f dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6] Received event network-changed-fdc3dac2-b9d1-4468-8307-a272a6efe638 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  1 05:26:06 np0005540826 nova_compute[229148]: 2025-12-01 10:26:06.550 229152 DEBUG nova.compute.manager [req-fa87d423-0dc2-4bc6-92b7-93b971787e11 req-93d0d32d-51ba-431e-a911-93f598d0ad8f dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6] Refreshing instance network info cache due to event network-changed-fdc3dac2-b9d1-4468-8307-a272a6efe638. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  1 05:26:06 np0005540826 nova_compute[229148]: 2025-12-01 10:26:06.550 229152 DEBUG oslo_concurrency.lockutils [req-fa87d423-0dc2-4bc6-92b7-93b971787e11 req-93d0d32d-51ba-431e-a911-93f598d0ad8f dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Acquiring lock "refresh_cache-3540e47c-6e86-4a7a-8843-8d7f1a7b01f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  1 05:26:06 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:26:06 np0005540826 nova_compute[229148]: 2025-12-01 10:26:06.915 229152 DEBUG nova.network.neutron [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  1 05:26:06 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:26:06 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:26:06 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:26:06.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:26:07 np0005540826 nova_compute[229148]: 2025-12-01 10:26:07.597 229152 DEBUG nova.network.neutron [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6] Updating instance_info_cache with network_info: [{"id": "fdc3dac2-b9d1-4468-8307-a272a6efe638", "address": "fa:16:3e:a0:86:0b", "network": {"id": "9ab4b830-6e4f-4874-8389-c75ccb124517", "bridge": "br-int", "label": "tempest-network-smoke--1868335417", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfdc3dac2-b9", "ovs_interfaceid": "fdc3dac2-b9d1-4468-8307-a272a6efe638", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  1 05:26:07 np0005540826 nova_compute[229148]: 2025-12-01 10:26:07.619 229152 DEBUG oslo_concurrency.lockutils [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Releasing lock "refresh_cache-3540e47c-6e86-4a7a-8843-8d7f1a7b01f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  1 05:26:07 np0005540826 nova_compute[229148]: 2025-12-01 10:26:07.619 229152 DEBUG nova.compute.manager [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6] Instance network_info: |[{"id": "fdc3dac2-b9d1-4468-8307-a272a6efe638", "address": "fa:16:3e:a0:86:0b", "network": {"id": "9ab4b830-6e4f-4874-8389-c75ccb124517", "bridge": "br-int", "label": "tempest-network-smoke--1868335417", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfdc3dac2-b9", "ovs_interfaceid": "fdc3dac2-b9d1-4468-8307-a272a6efe638", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  1 05:26:07 np0005540826 nova_compute[229148]: 2025-12-01 10:26:07.621 229152 DEBUG oslo_concurrency.lockutils [req-fa87d423-0dc2-4bc6-92b7-93b971787e11 req-93d0d32d-51ba-431e-a911-93f598d0ad8f dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Acquired lock "refresh_cache-3540e47c-6e86-4a7a-8843-8d7f1a7b01f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  1 05:26:07 np0005540826 nova_compute[229148]: 2025-12-01 10:26:07.621 229152 DEBUG nova.network.neutron [req-fa87d423-0dc2-4bc6-92b7-93b971787e11 req-93d0d32d-51ba-431e-a911-93f598d0ad8f dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6] Refreshing network info cache for port fdc3dac2-b9d1-4468-8307-a272a6efe638 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  1 05:26:07 np0005540826 nova_compute[229148]: 2025-12-01 10:26:07.625 229152 DEBUG nova.virt.libvirt.driver [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6] Start _get_guest_xml network_info=[{"id": "fdc3dac2-b9d1-4468-8307-a272a6efe638", "address": "fa:16:3e:a0:86:0b", "network": {"id": "9ab4b830-6e4f-4874-8389-c75ccb124517", "bridge": "br-int", "label": "tempest-network-smoke--1868335417", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfdc3dac2-b9", "ovs_interfaceid": "fdc3dac2-b9d1-4468-8307-a272a6efe638", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-01T10:14:19Z,direct_url=<?>,disk_format='qcow2',id=8f75d6de-6ce0-44e1-b417-d0111424475b,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9a5734898a6345909986f17ddf57b27d',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-01T10:14:22Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'encryption_options': None, 'disk_bus': 'virtio', 'encryption_format': None, 'size': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'device_type': 'disk', 'image_id': '8f75d6de-6ce0-44e1-b417-d0111424475b'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  1 05:26:07 np0005540826 nova_compute[229148]: 2025-12-01 10:26:07.630 229152 WARNING nova.virt.libvirt.driver [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  1 05:26:07 np0005540826 nova_compute[229148]: 2025-12-01 10:26:07.639 229152 DEBUG nova.virt.libvirt.host [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  1 05:26:07 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:26:07 np0005540826 nova_compute[229148]: 2025-12-01 10:26:07.640 229152 DEBUG nova.virt.libvirt.host [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  1 05:26:07 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:26:07 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:26:07.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:26:07 np0005540826 nova_compute[229148]: 2025-12-01 10:26:07.643 229152 DEBUG nova.virt.libvirt.host [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  1 05:26:07 np0005540826 nova_compute[229148]: 2025-12-01 10:26:07.644 229152 DEBUG nova.virt.libvirt.host [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  1 05:26:07 np0005540826 nova_compute[229148]: 2025-12-01 10:26:07.644 229152 DEBUG nova.virt.libvirt.driver [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  1 05:26:07 np0005540826 nova_compute[229148]: 2025-12-01 10:26:07.645 229152 DEBUG nova.virt.hardware [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-01T10:14:17Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='2e731827-1896-49cd-b0cc-12903555d217',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-01T10:14:19Z,direct_url=<?>,disk_format='qcow2',id=8f75d6de-6ce0-44e1-b417-d0111424475b,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9a5734898a6345909986f17ddf57b27d',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-01T10:14:22Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  1 05:26:07 np0005540826 nova_compute[229148]: 2025-12-01 10:26:07.645 229152 DEBUG nova.virt.hardware [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  1 05:26:07 np0005540826 nova_compute[229148]: 2025-12-01 10:26:07.645 229152 DEBUG nova.virt.hardware [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  1 05:26:07 np0005540826 nova_compute[229148]: 2025-12-01 10:26:07.646 229152 DEBUG nova.virt.hardware [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  1 05:26:07 np0005540826 nova_compute[229148]: 2025-12-01 10:26:07.646 229152 DEBUG nova.virt.hardware [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  1 05:26:07 np0005540826 nova_compute[229148]: 2025-12-01 10:26:07.646 229152 DEBUG nova.virt.hardware [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  1 05:26:07 np0005540826 nova_compute[229148]: 2025-12-01 10:26:07.646 229152 DEBUG nova.virt.hardware [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  1 05:26:07 np0005540826 nova_compute[229148]: 2025-12-01 10:26:07.647 229152 DEBUG nova.virt.hardware [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  1 05:26:07 np0005540826 nova_compute[229148]: 2025-12-01 10:26:07.647 229152 DEBUG nova.virt.hardware [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  1 05:26:07 np0005540826 nova_compute[229148]: 2025-12-01 10:26:07.647 229152 DEBUG nova.virt.hardware [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  1 05:26:07 np0005540826 nova_compute[229148]: 2025-12-01 10:26:07.647 229152 DEBUG nova.virt.hardware [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  1 05:26:07 np0005540826 nova_compute[229148]: 2025-12-01 10:26:07.650 229152 DEBUG oslo_concurrency.processutils [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:26:08 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec  1 05:26:08 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1623528846' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Dec  1 05:26:08 np0005540826 nova_compute[229148]: 2025-12-01 10:26:08.114 229152 DEBUG oslo_concurrency.processutils [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:26:08 np0005540826 nova_compute[229148]: 2025-12-01 10:26:08.142 229152 DEBUG nova.storage.rbd_utils [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] rbd image 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Dec  1 05:26:08 np0005540826 nova_compute[229148]: 2025-12-01 10:26:08.147 229152 DEBUG oslo_concurrency.processutils [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:26:08 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec  1 05:26:08 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1760247541' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Dec  1 05:26:08 np0005540826 nova_compute[229148]: 2025-12-01 10:26:08.593 229152 DEBUG oslo_concurrency.processutils [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.445s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:26:08 np0005540826 nova_compute[229148]: 2025-12-01 10:26:08.595 229152 DEBUG nova.virt.libvirt.vif [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-01T10:26:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-710555929',display_name='tempest-TestNetworkBasicOps-server-710555929',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-710555929',id=13,image_ref='8f75d6de-6ce0-44e1-b417-d0111424475b',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOCJULcfZ8I5IIMKMlR0pdassHRHuTcUFJWyuYNAB+a392CmeehyQeXIhQKo6FtMH2YikcXsxBJkVcxPOc85XzYnMu9gnibnkrDfq9TT6mFvC7c+O5MtR5wWoaeFcjpoBA==',key_name='tempest-TestNetworkBasicOps-1621648275',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9f6be4e572624210b91193c011607c08',ramdisk_id='',reservation_id='r-rq521rmu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='8f75d6de-6ce0-44e1-b417-d0111424475b',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1248115384',owner_user_name='tempest-TestNetworkBasicOps-1248115384-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-01T10:26:02Z,user_data=None,user_id='5b56a238daf0445798410e51caada0ff',uuid=3540e47c-6e86-4a7a-8843-8d7f1a7b01f6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fdc3dac2-b9d1-4468-8307-a272a6efe638", "address": "fa:16:3e:a0:86:0b", "network": {"id": "9ab4b830-6e4f-4874-8389-c75ccb124517", "bridge": "br-int", "label": "tempest-network-smoke--1868335417", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfdc3dac2-b9", "ovs_interfaceid": "fdc3dac2-b9d1-4468-8307-a272a6efe638", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  1 05:26:08 np0005540826 nova_compute[229148]: 2025-12-01 10:26:08.595 229152 DEBUG nova.network.os_vif_util [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Converting VIF {"id": "fdc3dac2-b9d1-4468-8307-a272a6efe638", "address": "fa:16:3e:a0:86:0b", "network": {"id": "9ab4b830-6e4f-4874-8389-c75ccb124517", "bridge": "br-int", "label": "tempest-network-smoke--1868335417", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfdc3dac2-b9", "ovs_interfaceid": "fdc3dac2-b9d1-4468-8307-a272a6efe638", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  1 05:26:08 np0005540826 nova_compute[229148]: 2025-12-01 10:26:08.596 229152 DEBUG nova.network.os_vif_util [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a0:86:0b,bridge_name='br-int',has_traffic_filtering=True,id=fdc3dac2-b9d1-4468-8307-a272a6efe638,network=Network(9ab4b830-6e4f-4874-8389-c75ccb124517),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfdc3dac2-b9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  1 05:26:08 np0005540826 nova_compute[229148]: 2025-12-01 10:26:08.598 229152 DEBUG nova.objects.instance [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lazy-loading 'pci_devices' on Instance uuid 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  1 05:26:08 np0005540826 nova_compute[229148]: 2025-12-01 10:26:08.621 229152 DEBUG nova.virt.libvirt.driver [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6] End _get_guest_xml xml=<domain type="kvm">
Dec  1 05:26:08 np0005540826 nova_compute[229148]:  <uuid>3540e47c-6e86-4a7a-8843-8d7f1a7b01f6</uuid>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:  <name>instance-0000000d</name>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:  <memory>131072</memory>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:  <vcpu>1</vcpu>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:  <metadata>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  1 05:26:08 np0005540826 nova_compute[229148]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:      <nova:name>tempest-TestNetworkBasicOps-server-710555929</nova:name>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:      <nova:creationTime>2025-12-01 10:26:07</nova:creationTime>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:      <nova:flavor name="m1.nano">
Dec  1 05:26:08 np0005540826 nova_compute[229148]:        <nova:memory>128</nova:memory>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:        <nova:disk>1</nova:disk>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:        <nova:swap>0</nova:swap>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:        <nova:ephemeral>0</nova:ephemeral>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:        <nova:vcpus>1</nova:vcpus>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:      </nova:flavor>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:      <nova:owner>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:        <nova:user uuid="5b56a238daf0445798410e51caada0ff">tempest-TestNetworkBasicOps-1248115384-project-member</nova:user>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:        <nova:project uuid="9f6be4e572624210b91193c011607c08">tempest-TestNetworkBasicOps-1248115384</nova:project>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:      </nova:owner>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:      <nova:root type="image" uuid="8f75d6de-6ce0-44e1-b417-d0111424475b"/>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:      <nova:ports>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:        <nova:port uuid="fdc3dac2-b9d1-4468-8307-a272a6efe638">
Dec  1 05:26:08 np0005540826 nova_compute[229148]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:        </nova:port>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:      </nova:ports>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:    </nova:instance>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:  </metadata>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:  <sysinfo type="smbios">
Dec  1 05:26:08 np0005540826 nova_compute[229148]:    <system>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:      <entry name="manufacturer">RDO</entry>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:      <entry name="product">OpenStack Compute</entry>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:      <entry name="serial">3540e47c-6e86-4a7a-8843-8d7f1a7b01f6</entry>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:      <entry name="uuid">3540e47c-6e86-4a7a-8843-8d7f1a7b01f6</entry>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:      <entry name="family">Virtual Machine</entry>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:    </system>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:  </sysinfo>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:  <os>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:    <boot dev="hd"/>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:    <smbios mode="sysinfo"/>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:  </os>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:  <features>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:    <acpi/>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:    <apic/>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:    <vmcoreinfo/>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:  </features>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:  <clock offset="utc">
Dec  1 05:26:08 np0005540826 nova_compute[229148]:    <timer name="pit" tickpolicy="delay"/>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:    <timer name="hpet" present="no"/>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:  </clock>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:  <cpu mode="host-model" match="exact">
Dec  1 05:26:08 np0005540826 nova_compute[229148]:    <topology sockets="1" cores="1" threads="1"/>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:  </cpu>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:  <devices>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:    <disk type="network" device="disk">
Dec  1 05:26:08 np0005540826 nova_compute[229148]:      <driver type="raw" cache="none"/>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:      <source protocol="rbd" name="vms/3540e47c-6e86-4a7a-8843-8d7f1a7b01f6_disk">
Dec  1 05:26:08 np0005540826 nova_compute[229148]:        <host name="192.168.122.100" port="6789"/>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:        <host name="192.168.122.102" port="6789"/>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:        <host name="192.168.122.101" port="6789"/>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:      </source>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:      <auth username="openstack">
Dec  1 05:26:08 np0005540826 nova_compute[229148]:        <secret type="ceph" uuid="365f19c2-81e5-5edd-b6b4-280555214d3a"/>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:      </auth>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:      <target dev="vda" bus="virtio"/>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:    </disk>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:    <disk type="network" device="cdrom">
Dec  1 05:26:08 np0005540826 nova_compute[229148]:      <driver type="raw" cache="none"/>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:      <source protocol="rbd" name="vms/3540e47c-6e86-4a7a-8843-8d7f1a7b01f6_disk.config">
Dec  1 05:26:08 np0005540826 nova_compute[229148]:        <host name="192.168.122.100" port="6789"/>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:        <host name="192.168.122.102" port="6789"/>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:        <host name="192.168.122.101" port="6789"/>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:      </source>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:      <auth username="openstack">
Dec  1 05:26:08 np0005540826 nova_compute[229148]:        <secret type="ceph" uuid="365f19c2-81e5-5edd-b6b4-280555214d3a"/>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:      </auth>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:      <target dev="sda" bus="sata"/>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:    </disk>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:    <interface type="ethernet">
Dec  1 05:26:08 np0005540826 nova_compute[229148]:      <mac address="fa:16:3e:a0:86:0b"/>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:      <model type="virtio"/>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:      <driver name="vhost" rx_queue_size="512"/>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:      <mtu size="1442"/>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:      <target dev="tapfdc3dac2-b9"/>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:    </interface>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:    <serial type="pty">
Dec  1 05:26:08 np0005540826 nova_compute[229148]:      <log file="/var/lib/nova/instances/3540e47c-6e86-4a7a-8843-8d7f1a7b01f6/console.log" append="off"/>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:    </serial>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:    <video>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:      <model type="virtio"/>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:    </video>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:    <input type="tablet" bus="usb"/>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:    <rng model="virtio">
Dec  1 05:26:08 np0005540826 nova_compute[229148]:      <backend model="random">/dev/urandom</backend>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:    </rng>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root"/>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:    <controller type="pci" model="pcie-root-port"/>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:    <controller type="usb" index="0"/>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:    <memballoon model="virtio">
Dec  1 05:26:08 np0005540826 nova_compute[229148]:      <stats period="10"/>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:    </memballoon>
Dec  1 05:26:08 np0005540826 nova_compute[229148]:  </devices>
Dec  1 05:26:08 np0005540826 nova_compute[229148]: </domain>
Dec  1 05:26:08 np0005540826 nova_compute[229148]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  1 05:26:08 np0005540826 nova_compute[229148]: 2025-12-01 10:26:08.622 229152 DEBUG nova.compute.manager [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6] Preparing to wait for external event network-vif-plugged-fdc3dac2-b9d1-4468-8307-a272a6efe638 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  1 05:26:08 np0005540826 nova_compute[229148]: 2025-12-01 10:26:08.623 229152 DEBUG oslo_concurrency.lockutils [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Acquiring lock "3540e47c-6e86-4a7a-8843-8d7f1a7b01f6-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:26:08 np0005540826 nova_compute[229148]: 2025-12-01 10:26:08.623 229152 DEBUG oslo_concurrency.lockutils [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "3540e47c-6e86-4a7a-8843-8d7f1a7b01f6-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:26:08 np0005540826 nova_compute[229148]: 2025-12-01 10:26:08.623 229152 DEBUG oslo_concurrency.lockutils [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "3540e47c-6e86-4a7a-8843-8d7f1a7b01f6-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:26:08 np0005540826 nova_compute[229148]: 2025-12-01 10:26:08.624 229152 DEBUG nova.virt.libvirt.vif [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-01T10:26:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-710555929',display_name='tempest-TestNetworkBasicOps-server-710555929',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-710555929',id=13,image_ref='8f75d6de-6ce0-44e1-b417-d0111424475b',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOCJULcfZ8I5IIMKMlR0pdassHRHuTcUFJWyuYNAB+a392CmeehyQeXIhQKo6FtMH2YikcXsxBJkVcxPOc85XzYnMu9gnibnkrDfq9TT6mFvC7c+O5MtR5wWoaeFcjpoBA==',key_name='tempest-TestNetworkBasicOps-1621648275',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9f6be4e572624210b91193c011607c08',ramdisk_id='',reservation_id='r-rq521rmu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='8f75d6de-6ce0-44e1-b417-d0111424475b',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1248115384',owner_user_name='tempest-TestNetworkBasicOps-1248115384-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-01T10:26:02Z,user_data=None,user_id='5b56a238daf0445798410e51caada0ff',uuid=3540e47c-6e86-4a7a-8843-8d7f1a7b01f6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fdc3dac2-b9d1-4468-8307-a272a6efe638", "address": "fa:16:3e:a0:86:0b", "network": {"id": "9ab4b830-6e4f-4874-8389-c75ccb124517", "bridge": "br-int", "label": "tempest-network-smoke--1868335417", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfdc3dac2-b9", "ovs_interfaceid": "fdc3dac2-b9d1-4468-8307-a272a6efe638", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  1 05:26:08 np0005540826 nova_compute[229148]: 2025-12-01 10:26:08.624 229152 DEBUG nova.network.os_vif_util [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Converting VIF {"id": "fdc3dac2-b9d1-4468-8307-a272a6efe638", "address": "fa:16:3e:a0:86:0b", "network": {"id": "9ab4b830-6e4f-4874-8389-c75ccb124517", "bridge": "br-int", "label": "tempest-network-smoke--1868335417", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfdc3dac2-b9", "ovs_interfaceid": "fdc3dac2-b9d1-4468-8307-a272a6efe638", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  1 05:26:08 np0005540826 nova_compute[229148]: 2025-12-01 10:26:08.625 229152 DEBUG nova.network.os_vif_util [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a0:86:0b,bridge_name='br-int',has_traffic_filtering=True,id=fdc3dac2-b9d1-4468-8307-a272a6efe638,network=Network(9ab4b830-6e4f-4874-8389-c75ccb124517),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfdc3dac2-b9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  1 05:26:08 np0005540826 nova_compute[229148]: 2025-12-01 10:26:08.625 229152 DEBUG os_vif [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a0:86:0b,bridge_name='br-int',has_traffic_filtering=True,id=fdc3dac2-b9d1-4468-8307-a272a6efe638,network=Network(9ab4b830-6e4f-4874-8389-c75ccb124517),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfdc3dac2-b9') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  1 05:26:08 np0005540826 nova_compute[229148]: 2025-12-01 10:26:08.626 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:26:08 np0005540826 nova_compute[229148]: 2025-12-01 10:26:08.627 229152 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  1 05:26:08 np0005540826 nova_compute[229148]: 2025-12-01 10:26:08.627 229152 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  1 05:26:08 np0005540826 nova_compute[229148]: 2025-12-01 10:26:08.631 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:26:08 np0005540826 nova_compute[229148]: 2025-12-01 10:26:08.631 229152 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfdc3dac2-b9, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  1 05:26:08 np0005540826 nova_compute[229148]: 2025-12-01 10:26:08.632 229152 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapfdc3dac2-b9, col_values=(('external_ids', {'iface-id': 'fdc3dac2-b9d1-4468-8307-a272a6efe638', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a0:86:0b', 'vm-uuid': '3540e47c-6e86-4a7a-8843-8d7f1a7b01f6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  1 05:26:08 np0005540826 nova_compute[229148]: 2025-12-01 10:26:08.633 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:26:08 np0005540826 NetworkManager[48989]: <info>  [1764584768.6345] manager: (tapfdc3dac2-b9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/69)
Dec  1 05:26:08 np0005540826 nova_compute[229148]: 2025-12-01 10:26:08.635 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  1 05:26:08 np0005540826 nova_compute[229148]: 2025-12-01 10:26:08.640 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:26:08 np0005540826 nova_compute[229148]: 2025-12-01 10:26:08.641 229152 INFO os_vif [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a0:86:0b,bridge_name='br-int',has_traffic_filtering=True,id=fdc3dac2-b9d1-4468-8307-a272a6efe638,network=Network(9ab4b830-6e4f-4874-8389-c75ccb124517),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfdc3dac2-b9')#033[00m
Dec  1 05:26:08 np0005540826 nova_compute[229148]: 2025-12-01 10:26:08.700 229152 DEBUG nova.virt.libvirt.driver [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  1 05:26:08 np0005540826 nova_compute[229148]: 2025-12-01 10:26:08.700 229152 DEBUG nova.virt.libvirt.driver [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  1 05:26:08 np0005540826 nova_compute[229148]: 2025-12-01 10:26:08.700 229152 DEBUG nova.virt.libvirt.driver [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] No VIF found with MAC fa:16:3e:a0:86:0b, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  1 05:26:08 np0005540826 nova_compute[229148]: 2025-12-01 10:26:08.701 229152 INFO nova.virt.libvirt.driver [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6] Using config drive#033[00m
Dec  1 05:26:08 np0005540826 nova_compute[229148]: 2025-12-01 10:26:08.726 229152 DEBUG nova.storage.rbd_utils [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] rbd image 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Dec  1 05:26:08 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:26:08 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:26:08 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:26:08.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:26:09 np0005540826 nova_compute[229148]: 2025-12-01 10:26:09.308 229152 INFO nova.virt.libvirt.driver [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6] Creating config drive at /var/lib/nova/instances/3540e47c-6e86-4a7a-8843-8d7f1a7b01f6/disk.config#033[00m
Dec  1 05:26:09 np0005540826 nova_compute[229148]: 2025-12-01 10:26:09.313 229152 DEBUG oslo_concurrency.processutils [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3540e47c-6e86-4a7a-8843-8d7f1a7b01f6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp0t8fjmnx execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:26:09 np0005540826 nova_compute[229148]: 2025-12-01 10:26:09.335 229152 DEBUG nova.network.neutron [req-fa87d423-0dc2-4bc6-92b7-93b971787e11 req-93d0d32d-51ba-431e-a911-93f598d0ad8f dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6] Updated VIF entry in instance network info cache for port fdc3dac2-b9d1-4468-8307-a272a6efe638. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  1 05:26:09 np0005540826 nova_compute[229148]: 2025-12-01 10:26:09.336 229152 DEBUG nova.network.neutron [req-fa87d423-0dc2-4bc6-92b7-93b971787e11 req-93d0d32d-51ba-431e-a911-93f598d0ad8f dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6] Updating instance_info_cache with network_info: [{"id": "fdc3dac2-b9d1-4468-8307-a272a6efe638", "address": "fa:16:3e:a0:86:0b", "network": {"id": "9ab4b830-6e4f-4874-8389-c75ccb124517", "bridge": "br-int", "label": "tempest-network-smoke--1868335417", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfdc3dac2-b9", "ovs_interfaceid": "fdc3dac2-b9d1-4468-8307-a272a6efe638", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  1 05:26:09 np0005540826 nova_compute[229148]: 2025-12-01 10:26:09.358 229152 DEBUG oslo_concurrency.lockutils [req-fa87d423-0dc2-4bc6-92b7-93b971787e11 req-93d0d32d-51ba-431e-a911-93f598d0ad8f dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Releasing lock "refresh_cache-3540e47c-6e86-4a7a-8843-8d7f1a7b01f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  1 05:26:09 np0005540826 nova_compute[229148]: 2025-12-01 10:26:09.442 229152 DEBUG oslo_concurrency.processutils [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3540e47c-6e86-4a7a-8843-8d7f1a7b01f6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp0t8fjmnx" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:26:09 np0005540826 nova_compute[229148]: 2025-12-01 10:26:09.471 229152 DEBUG nova.storage.rbd_utils [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] rbd image 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Dec  1 05:26:09 np0005540826 nova_compute[229148]: 2025-12-01 10:26:09.475 229152 DEBUG oslo_concurrency.processutils [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/3540e47c-6e86-4a7a-8843-8d7f1a7b01f6/disk.config 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:26:09 np0005540826 nova_compute[229148]: 2025-12-01 10:26:09.636 229152 DEBUG oslo_concurrency.processutils [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/3540e47c-6e86-4a7a-8843-8d7f1a7b01f6/disk.config 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.161s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:26:09 np0005540826 nova_compute[229148]: 2025-12-01 10:26:09.637 229152 INFO nova.virt.libvirt.driver [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6] Deleting local config drive /var/lib/nova/instances/3540e47c-6e86-4a7a-8843-8d7f1a7b01f6/disk.config because it was imported into RBD.#033[00m
Dec  1 05:26:09 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:26:09 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:26:09 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:26:09.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:26:09 np0005540826 kernel: tapfdc3dac2-b9: entered promiscuous mode
Dec  1 05:26:09 np0005540826 NetworkManager[48989]: <info>  [1764584769.7022] manager: (tapfdc3dac2-b9): new Tun device (/org/freedesktop/NetworkManager/Devices/70)
Dec  1 05:26:09 np0005540826 ovn_controller[132309]: 2025-12-01T10:26:09Z|00104|binding|INFO|Claiming lport fdc3dac2-b9d1-4468-8307-a272a6efe638 for this chassis.
Dec  1 05:26:09 np0005540826 ovn_controller[132309]: 2025-12-01T10:26:09Z|00105|binding|INFO|fdc3dac2-b9d1-4468-8307-a272a6efe638: Claiming fa:16:3e:a0:86:0b 10.100.0.13
Dec  1 05:26:09 np0005540826 nova_compute[229148]: 2025-12-01 10:26:09.703 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:26:09 np0005540826 nova_compute[229148]: 2025-12-01 10:26:09.709 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:26:09 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:26:09.721 141685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a0:86:0b 10.100.0.13'], port_security=['fa:16:3e:a0:86:0b 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '3540e47c-6e86-4a7a-8843-8d7f1a7b01f6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9ab4b830-6e4f-4874-8389-c75ccb124517', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9f6be4e572624210b91193c011607c08', 'neutron:revision_number': '2', 'neutron:security_group_ids': '511c736b-c125-4be8-86f9-d725384a5e8b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f739f97a-512c-42d9-b283-488e92f014a6, chassis=[<ovs.db.idl.Row object at 0x7fe07c6626d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe07c6626d0>], logical_port=fdc3dac2-b9d1-4468-8307-a272a6efe638) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  1 05:26:09 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:26:09.722 141685 INFO neutron.agent.ovn.metadata.agent [-] Port fdc3dac2-b9d1-4468-8307-a272a6efe638 in datapath 9ab4b830-6e4f-4874-8389-c75ccb124517 bound to our chassis#033[00m
Dec  1 05:26:09 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:26:09.723 141685 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9ab4b830-6e4f-4874-8389-c75ccb124517#033[00m
Dec  1 05:26:09 np0005540826 systemd-udevd[243383]: Network interface NamePolicy= disabled on kernel command line.
Dec  1 05:26:09 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:26:09.737 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[ef827861-31d9-4997-825c-9f9344032d6f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:26:09 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:26:09.738 141685 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9ab4b830-61 in ovnmeta-9ab4b830-6e4f-4874-8389-c75ccb124517 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec  1 05:26:09 np0005540826 systemd-machined[192474]: New machine qemu-7-instance-0000000d.
Dec  1 05:26:09 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:26:09.740 233565 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9ab4b830-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec  1 05:26:09 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:26:09.740 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[4e24a973-9ded-4ca4-bb78-cb2192c10bf9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:26:09 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:26:09.741 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[adf2dc9a-0e89-4832-afcc-ac204a4d68ea]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:26:09 np0005540826 NetworkManager[48989]: <info>  [1764584769.7487] device (tapfdc3dac2-b9): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  1 05:26:09 np0005540826 NetworkManager[48989]: <info>  [1764584769.7505] device (tapfdc3dac2-b9): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  1 05:26:09 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:26:09.753 141797 DEBUG oslo.privsep.daemon [-] privsep: reply[20368021-0034-4ef1-990c-739ca5b200fc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:26:09 np0005540826 systemd[1]: Started Virtual Machine qemu-7-instance-0000000d.
Dec  1 05:26:09 np0005540826 nova_compute[229148]: 2025-12-01 10:26:09.767 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:26:09 np0005540826 ovn_controller[132309]: 2025-12-01T10:26:09Z|00106|binding|INFO|Setting lport fdc3dac2-b9d1-4468-8307-a272a6efe638 ovn-installed in OVS
Dec  1 05:26:09 np0005540826 ovn_controller[132309]: 2025-12-01T10:26:09Z|00107|binding|INFO|Setting lport fdc3dac2-b9d1-4468-8307-a272a6efe638 up in Southbound
Dec  1 05:26:09 np0005540826 nova_compute[229148]: 2025-12-01 10:26:09.775 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:26:09 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:26:09.778 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[26e3538b-4f4e-48a1-81bc-3beaf86c1f1c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:26:09 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:26:09.808 233643 DEBUG oslo.privsep.daemon [-] privsep: reply[7fa3bf85-6a5f-4bd3-b876-4fbe2a05978c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:26:09 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:26:09.813 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[68d4cbf9-bb93-4363-9fed-68023cfb78bf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:26:09 np0005540826 NetworkManager[48989]: <info>  [1764584769.8146] manager: (tap9ab4b830-60): new Veth device (/org/freedesktop/NetworkManager/Devices/71)
Dec  1 05:26:09 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:26:09.843 233643 DEBUG oslo.privsep.daemon [-] privsep: reply[f915be55-6e2a-4ea7-87be-f90e8e4510d5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:26:09 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:26:09.846 233643 DEBUG oslo.privsep.daemon [-] privsep: reply[6bac8045-6016-40e1-8e85-0f6dd1699d4a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:26:09 np0005540826 NetworkManager[48989]: <info>  [1764584769.8652] device (tap9ab4b830-60): carrier: link connected
Dec  1 05:26:09 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:26:09.869 233643 DEBUG oslo.privsep.daemon [-] privsep: reply[3e9beaab-cfb6-4047-896c-b18c17b5e715]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:26:09 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:26:09.887 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[5cd7e5b4-4983-4b50-9768-2ac4b072e07e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9ab4b830-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:79:96:5b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 35], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 463238, 'reachable_time': 19785, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 243416, 'error': None, 'target': 'ovnmeta-9ab4b830-6e4f-4874-8389-c75ccb124517', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:26:09 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:26:09.905 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[b1003c73-a826-47f8-8c53-17c5ebb49a07]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe79:965b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 463238, 'tstamp': 463238}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 243417, 'error': None, 'target': 'ovnmeta-9ab4b830-6e4f-4874-8389-c75ccb124517', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:26:09 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:26:09.926 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[8d7de7ea-8e22-4261-948d-f59dc2e94c36]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9ab4b830-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:79:96:5b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 35], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 463238, 'reachable_time': 19785, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 243418, 'error': None, 'target': 'ovnmeta-9ab4b830-6e4f-4874-8389-c75ccb124517', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:26:09 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:26:09.963 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[e8565f5a-e896-4e23-9ea0-245356b24974]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:26:10 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:26:10.026 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[9b7f8938-d7d8-49e0-b299-99f831b41ff2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:26:10 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:26:10.027 141685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9ab4b830-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  1 05:26:10 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:26:10.028 141685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  1 05:26:10 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:26:10.029 141685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9ab4b830-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  1 05:26:10 np0005540826 kernel: tap9ab4b830-60: entered promiscuous mode
Dec  1 05:26:10 np0005540826 nova_compute[229148]: 2025-12-01 10:26:10.034 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:26:10 np0005540826 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  1 05:26:10 np0005540826 NetworkManager[48989]: <info>  [1764584770.0383] manager: (tap9ab4b830-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/72)
Dec  1 05:26:10 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:26:10.038 141685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9ab4b830-60, col_values=(('external_ids', {'iface-id': '2b75a10e-a855-4ff1-9d02-69f69c832f5c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  1 05:26:10 np0005540826 ovn_controller[132309]: 2025-12-01T10:26:10Z|00108|binding|INFO|Releasing lport 2b75a10e-a855-4ff1-9d02-69f69c832f5c from this chassis (sb_readonly=0)
Dec  1 05:26:10 np0005540826 nova_compute[229148]: 2025-12-01 10:26:10.040 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:26:10 np0005540826 nova_compute[229148]: 2025-12-01 10:26:10.041 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:26:10 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:26:10.041 141685 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9ab4b830-6e4f-4874-8389-c75ccb124517.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9ab4b830-6e4f-4874-8389-c75ccb124517.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec  1 05:26:10 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:26:10.043 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[a0aa93d8-9e4d-42c0-b7dc-7a2bf3420d17]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:26:10 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:26:10.044 141685 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec  1 05:26:10 np0005540826 ovn_metadata_agent[141680]: global
Dec  1 05:26:10 np0005540826 ovn_metadata_agent[141680]:    log         /dev/log local0 debug
Dec  1 05:26:10 np0005540826 ovn_metadata_agent[141680]:    log-tag     haproxy-metadata-proxy-9ab4b830-6e4f-4874-8389-c75ccb124517
Dec  1 05:26:10 np0005540826 ovn_metadata_agent[141680]:    user        root
Dec  1 05:26:10 np0005540826 ovn_metadata_agent[141680]:    group       root
Dec  1 05:26:10 np0005540826 ovn_metadata_agent[141680]:    maxconn     1024
Dec  1 05:26:10 np0005540826 ovn_metadata_agent[141680]:    pidfile     /var/lib/neutron/external/pids/9ab4b830-6e4f-4874-8389-c75ccb124517.pid.haproxy
Dec  1 05:26:10 np0005540826 ovn_metadata_agent[141680]:    daemon
Dec  1 05:26:10 np0005540826 ovn_metadata_agent[141680]: 
Dec  1 05:26:10 np0005540826 ovn_metadata_agent[141680]: defaults
Dec  1 05:26:10 np0005540826 ovn_metadata_agent[141680]:    log global
Dec  1 05:26:10 np0005540826 ovn_metadata_agent[141680]:    mode http
Dec  1 05:26:10 np0005540826 ovn_metadata_agent[141680]:    option httplog
Dec  1 05:26:10 np0005540826 ovn_metadata_agent[141680]:    option dontlognull
Dec  1 05:26:10 np0005540826 ovn_metadata_agent[141680]:    option http-server-close
Dec  1 05:26:10 np0005540826 ovn_metadata_agent[141680]:    option forwardfor
Dec  1 05:26:10 np0005540826 ovn_metadata_agent[141680]:    retries                 3
Dec  1 05:26:10 np0005540826 ovn_metadata_agent[141680]:    timeout http-request    30s
Dec  1 05:26:10 np0005540826 ovn_metadata_agent[141680]:    timeout connect         30s
Dec  1 05:26:10 np0005540826 ovn_metadata_agent[141680]:    timeout client          32s
Dec  1 05:26:10 np0005540826 ovn_metadata_agent[141680]:    timeout server          32s
Dec  1 05:26:10 np0005540826 ovn_metadata_agent[141680]:    timeout http-keep-alive 30s
Dec  1 05:26:10 np0005540826 ovn_metadata_agent[141680]: 
Dec  1 05:26:10 np0005540826 ovn_metadata_agent[141680]: 
Dec  1 05:26:10 np0005540826 ovn_metadata_agent[141680]: listen listener
Dec  1 05:26:10 np0005540826 ovn_metadata_agent[141680]:    bind 169.254.169.254:80
Dec  1 05:26:10 np0005540826 ovn_metadata_agent[141680]:    server metadata /var/lib/neutron/metadata_proxy
Dec  1 05:26:10 np0005540826 ovn_metadata_agent[141680]:    http-request add-header X-OVN-Network-ID 9ab4b830-6e4f-4874-8389-c75ccb124517
Dec  1 05:26:10 np0005540826 ovn_metadata_agent[141680]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec  1 05:26:10 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:26:10.044 141685 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9ab4b830-6e4f-4874-8389-c75ccb124517', 'env', 'PROCESS_TAG=haproxy-9ab4b830-6e4f-4874-8389-c75ccb124517', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9ab4b830-6e4f-4874-8389-c75ccb124517.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec  1 05:26:10 np0005540826 nova_compute[229148]: 2025-12-01 10:26:10.054 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:26:10 np0005540826 nova_compute[229148]: 2025-12-01 10:26:10.068 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:26:10 np0005540826 nova_compute[229148]: 2025-12-01 10:26:10.131 229152 DEBUG nova.virt.driver [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] Emitting event <LifecycleEvent: 1764584770.1301017, 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  1 05:26:10 np0005540826 nova_compute[229148]: 2025-12-01 10:26:10.131 229152 INFO nova.compute.manager [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] [instance: 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6] VM Started (Lifecycle Event)#033[00m
Dec  1 05:26:10 np0005540826 nova_compute[229148]: 2025-12-01 10:26:10.151 229152 DEBUG nova.compute.manager [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] [instance: 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  1 05:26:10 np0005540826 nova_compute[229148]: 2025-12-01 10:26:10.159 229152 DEBUG nova.virt.driver [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] Emitting event <LifecycleEvent: 1764584770.1304612, 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  1 05:26:10 np0005540826 nova_compute[229148]: 2025-12-01 10:26:10.160 229152 INFO nova.compute.manager [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] [instance: 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6] VM Paused (Lifecycle Event)#033[00m
Dec  1 05:26:10 np0005540826 nova_compute[229148]: 2025-12-01 10:26:10.179 229152 DEBUG nova.compute.manager [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] [instance: 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  1 05:26:10 np0005540826 nova_compute[229148]: 2025-12-01 10:26:10.182 229152 DEBUG nova.compute.manager [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] [instance: 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  1 05:26:10 np0005540826 nova_compute[229148]: 2025-12-01 10:26:10.206 229152 INFO nova.compute.manager [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] [instance: 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  1 05:26:10 np0005540826 podman[243493]: 2025-12-01 10:26:10.41644921 +0000 UTC m=+0.055254334 container create 49171569bc7cee58e2a373e4973146bedfb0ec324cd81bd23276b3cb01a863d7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9ab4b830-6e4f-4874-8389-c75ccb124517, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec  1 05:26:10 np0005540826 nova_compute[229148]: 2025-12-01 10:26:10.437 229152 DEBUG nova.compute.manager [req-e72d66f5-e54b-4ed2-9053-af3e5bc1ddd3 req-d62d1d59-2173-45ac-83bc-c323ec068190 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6] Received event network-vif-plugged-fdc3dac2-b9d1-4468-8307-a272a6efe638 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  1 05:26:10 np0005540826 nova_compute[229148]: 2025-12-01 10:26:10.438 229152 DEBUG oslo_concurrency.lockutils [req-e72d66f5-e54b-4ed2-9053-af3e5bc1ddd3 req-d62d1d59-2173-45ac-83bc-c323ec068190 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Acquiring lock "3540e47c-6e86-4a7a-8843-8d7f1a7b01f6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:26:10 np0005540826 nova_compute[229148]: 2025-12-01 10:26:10.438 229152 DEBUG oslo_concurrency.lockutils [req-e72d66f5-e54b-4ed2-9053-af3e5bc1ddd3 req-d62d1d59-2173-45ac-83bc-c323ec068190 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Lock "3540e47c-6e86-4a7a-8843-8d7f1a7b01f6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:26:10 np0005540826 nova_compute[229148]: 2025-12-01 10:26:10.438 229152 DEBUG oslo_concurrency.lockutils [req-e72d66f5-e54b-4ed2-9053-af3e5bc1ddd3 req-d62d1d59-2173-45ac-83bc-c323ec068190 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Lock "3540e47c-6e86-4a7a-8843-8d7f1a7b01f6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:26:10 np0005540826 nova_compute[229148]: 2025-12-01 10:26:10.438 229152 DEBUG nova.compute.manager [req-e72d66f5-e54b-4ed2-9053-af3e5bc1ddd3 req-d62d1d59-2173-45ac-83bc-c323ec068190 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6] Processing event network-vif-plugged-fdc3dac2-b9d1-4468-8307-a272a6efe638 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec  1 05:26:10 np0005540826 nova_compute[229148]: 2025-12-01 10:26:10.439 229152 DEBUG nova.compute.manager [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  1 05:26:10 np0005540826 nova_compute[229148]: 2025-12-01 10:26:10.444 229152 DEBUG nova.virt.driver [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] Emitting event <LifecycleEvent: 1764584770.4435935, 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  1 05:26:10 np0005540826 nova_compute[229148]: 2025-12-01 10:26:10.444 229152 INFO nova.compute.manager [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] [instance: 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6] VM Resumed (Lifecycle Event)#033[00m
Dec  1 05:26:10 np0005540826 nova_compute[229148]: 2025-12-01 10:26:10.448 229152 DEBUG nova.virt.libvirt.driver [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  1 05:26:10 np0005540826 nova_compute[229148]: 2025-12-01 10:26:10.451 229152 INFO nova.virt.libvirt.driver [-] [instance: 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6] Instance spawned successfully.#033[00m
Dec  1 05:26:10 np0005540826 nova_compute[229148]: 2025-12-01 10:26:10.451 229152 DEBUG nova.virt.libvirt.driver [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  1 05:26:10 np0005540826 nova_compute[229148]: 2025-12-01 10:26:10.468 229152 DEBUG nova.compute.manager [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] [instance: 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  1 05:26:10 np0005540826 systemd[1]: Started libpod-conmon-49171569bc7cee58e2a373e4973146bedfb0ec324cd81bd23276b3cb01a863d7.scope.
Dec  1 05:26:10 np0005540826 nova_compute[229148]: 2025-12-01 10:26:10.476 229152 DEBUG nova.compute.manager [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] [instance: 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  1 05:26:10 np0005540826 podman[243493]: 2025-12-01 10:26:10.383507784 +0000 UTC m=+0.022312918 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  1 05:26:10 np0005540826 nova_compute[229148]: 2025-12-01 10:26:10.480 229152 DEBUG nova.virt.libvirt.driver [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  1 05:26:10 np0005540826 nova_compute[229148]: 2025-12-01 10:26:10.480 229152 DEBUG nova.virt.libvirt.driver [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  1 05:26:10 np0005540826 nova_compute[229148]: 2025-12-01 10:26:10.481 229152 DEBUG nova.virt.libvirt.driver [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  1 05:26:10 np0005540826 nova_compute[229148]: 2025-12-01 10:26:10.481 229152 DEBUG nova.virt.libvirt.driver [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  1 05:26:10 np0005540826 nova_compute[229148]: 2025-12-01 10:26:10.482 229152 DEBUG nova.virt.libvirt.driver [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  1 05:26:10 np0005540826 nova_compute[229148]: 2025-12-01 10:26:10.482 229152 DEBUG nova.virt.libvirt.driver [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  1 05:26:10 np0005540826 systemd[1]: Started libcrun container.
Dec  1 05:26:10 np0005540826 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1e865b783acf00d8416dbbed118393538747baafb619f4c6a6e9cef16cf12f05/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  1 05:26:10 np0005540826 nova_compute[229148]: 2025-12-01 10:26:10.517 229152 INFO nova.compute.manager [None req-7c282b0b-df38-48e2-ab6c-344463e0a131 - - - - - -] [instance: 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  1 05:26:10 np0005540826 podman[243493]: 2025-12-01 10:26:10.526644797 +0000 UTC m=+0.165449941 container init 49171569bc7cee58e2a373e4973146bedfb0ec324cd81bd23276b3cb01a863d7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9ab4b830-6e4f-4874-8389-c75ccb124517, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true)
Dec  1 05:26:10 np0005540826 podman[243493]: 2025-12-01 10:26:10.532217749 +0000 UTC m=+0.171022873 container start 49171569bc7cee58e2a373e4973146bedfb0ec324cd81bd23276b3cb01a863d7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9ab4b830-6e4f-4874-8389-c75ccb124517, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec  1 05:26:10 np0005540826 neutron-haproxy-ovnmeta-9ab4b830-6e4f-4874-8389-c75ccb124517[243509]: [NOTICE]   (243513) : New worker (243515) forked
Dec  1 05:26:10 np0005540826 neutron-haproxy-ovnmeta-9ab4b830-6e4f-4874-8389-c75ccb124517[243509]: [NOTICE]   (243513) : Loading success.
Dec  1 05:26:10 np0005540826 nova_compute[229148]: 2025-12-01 10:26:10.568 229152 INFO nova.compute.manager [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6] Took 7.50 seconds to spawn the instance on the hypervisor.#033[00m
Dec  1 05:26:10 np0005540826 nova_compute[229148]: 2025-12-01 10:26:10.568 229152 DEBUG nova.compute.manager [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  1 05:26:10 np0005540826 nova_compute[229148]: 2025-12-01 10:26:10.658 229152 INFO nova.compute.manager [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6] Took 8.44 seconds to build instance.#033[00m
Dec  1 05:26:10 np0005540826 nova_compute[229148]: 2025-12-01 10:26:10.677 229152 DEBUG oslo_concurrency.lockutils [None req-b85006f6-18e3-4b6e-a5e4-4d06256b139b 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "3540e47c-6e86-4a7a-8843-8d7f1a7b01f6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.537s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:26:10 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:26:10 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:26:10 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:26:10.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:26:11 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:26:11 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:26:11 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:26:11.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:26:11 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:26:12 np0005540826 nova_compute[229148]: 2025-12-01 10:26:12.535 229152 DEBUG nova.compute.manager [req-d542f39e-ea63-49ce-bc17-e10971fabb5a req-426c5441-a439-4b76-9836-0777c03ed274 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6] Received event network-vif-plugged-fdc3dac2-b9d1-4468-8307-a272a6efe638 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  1 05:26:12 np0005540826 nova_compute[229148]: 2025-12-01 10:26:12.536 229152 DEBUG oslo_concurrency.lockutils [req-d542f39e-ea63-49ce-bc17-e10971fabb5a req-426c5441-a439-4b76-9836-0777c03ed274 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Acquiring lock "3540e47c-6e86-4a7a-8843-8d7f1a7b01f6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:26:12 np0005540826 nova_compute[229148]: 2025-12-01 10:26:12.536 229152 DEBUG oslo_concurrency.lockutils [req-d542f39e-ea63-49ce-bc17-e10971fabb5a req-426c5441-a439-4b76-9836-0777c03ed274 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Lock "3540e47c-6e86-4a7a-8843-8d7f1a7b01f6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:26:12 np0005540826 nova_compute[229148]: 2025-12-01 10:26:12.536 229152 DEBUG oslo_concurrency.lockutils [req-d542f39e-ea63-49ce-bc17-e10971fabb5a req-426c5441-a439-4b76-9836-0777c03ed274 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Lock "3540e47c-6e86-4a7a-8843-8d7f1a7b01f6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:26:12 np0005540826 nova_compute[229148]: 2025-12-01 10:26:12.536 229152 DEBUG nova.compute.manager [req-d542f39e-ea63-49ce-bc17-e10971fabb5a req-426c5441-a439-4b76-9836-0777c03ed274 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6] No waiting events found dispatching network-vif-plugged-fdc3dac2-b9d1-4468-8307-a272a6efe638 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  1 05:26:12 np0005540826 nova_compute[229148]: 2025-12-01 10:26:12.537 229152 WARNING nova.compute.manager [req-d542f39e-ea63-49ce-bc17-e10971fabb5a req-426c5441-a439-4b76-9836-0777c03ed274 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6] Received unexpected event network-vif-plugged-fdc3dac2-b9d1-4468-8307-a272a6efe638 for instance with vm_state active and task_state None.#033[00m
Dec  1 05:26:12 np0005540826 NetworkManager[48989]: <info>  [1764584772.6572] manager: (patch-provnet-da274a4a-a49c-4f01-b728-391696cd2672-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/73)
Dec  1 05:26:12 np0005540826 NetworkManager[48989]: <info>  [1764584772.6581] manager: (patch-br-int-to-provnet-da274a4a-a49c-4f01-b728-391696cd2672): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/74)
Dec  1 05:26:12 np0005540826 ovn_controller[132309]: 2025-12-01T10:26:12Z|00109|binding|INFO|Releasing lport 2b75a10e-a855-4ff1-9d02-69f69c832f5c from this chassis (sb_readonly=0)
Dec  1 05:26:12 np0005540826 nova_compute[229148]: 2025-12-01 10:26:12.656 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:26:12 np0005540826 ovn_controller[132309]: 2025-12-01T10:26:12Z|00110|binding|INFO|Releasing lport 2b75a10e-a855-4ff1-9d02-69f69c832f5c from this chassis (sb_readonly=0)
Dec  1 05:26:12 np0005540826 nova_compute[229148]: 2025-12-01 10:26:12.689 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:26:12 np0005540826 nova_compute[229148]: 2025-12-01 10:26:12.695 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:26:12 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:26:12 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:26:12 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:26:12.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:26:13 np0005540826 nova_compute[229148]: 2025-12-01 10:26:13.634 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:26:13 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:26:13 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:26:13 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:26:13.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:26:14 np0005540826 nova_compute[229148]: 2025-12-01 10:26:14.632 229152 DEBUG nova.compute.manager [req-d4ca51be-1245-43b4-aa8e-5e03465ab8a8 req-c9d4b20e-89c2-4b68-bba8-6d4f4a085c22 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6] Received event network-changed-fdc3dac2-b9d1-4468-8307-a272a6efe638 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec  1 05:26:14 np0005540826 nova_compute[229148]: 2025-12-01 10:26:14.632 229152 DEBUG nova.compute.manager [req-d4ca51be-1245-43b4-aa8e-5e03465ab8a8 req-c9d4b20e-89c2-4b68-bba8-6d4f4a085c22 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6] Refreshing instance network info cache due to event network-changed-fdc3dac2-b9d1-4468-8307-a272a6efe638. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec  1 05:26:14 np0005540826 nova_compute[229148]: 2025-12-01 10:26:14.632 229152 DEBUG oslo_concurrency.lockutils [req-d4ca51be-1245-43b4-aa8e-5e03465ab8a8 req-c9d4b20e-89c2-4b68-bba8-6d4f4a085c22 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Acquiring lock "refresh_cache-3540e47c-6e86-4a7a-8843-8d7f1a7b01f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec  1 05:26:14 np0005540826 nova_compute[229148]: 2025-12-01 10:26:14.633 229152 DEBUG oslo_concurrency.lockutils [req-d4ca51be-1245-43b4-aa8e-5e03465ab8a8 req-c9d4b20e-89c2-4b68-bba8-6d4f4a085c22 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Acquired lock "refresh_cache-3540e47c-6e86-4a7a-8843-8d7f1a7b01f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec  1 05:26:14 np0005540826 nova_compute[229148]: 2025-12-01 10:26:14.633 229152 DEBUG nova.network.neutron [req-d4ca51be-1245-43b4-aa8e-5e03465ab8a8 req-c9d4b20e-89c2-4b68-bba8-6d4f4a085c22 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6] Refreshing network info cache for port fdc3dac2-b9d1-4468-8307-a272a6efe638 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec  1 05:26:14 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:26:14 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:26:14 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:26:14.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:26:15 np0005540826 nova_compute[229148]: 2025-12-01 10:26:15.070 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  1 05:26:15 np0005540826 nova_compute[229148]: 2025-12-01 10:26:15.594 229152 DEBUG nova.network.neutron [req-d4ca51be-1245-43b4-aa8e-5e03465ab8a8 req-c9d4b20e-89c2-4b68-bba8-6d4f4a085c22 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6] Updated VIF entry in instance network info cache for port fdc3dac2-b9d1-4468-8307-a272a6efe638. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec  1 05:26:15 np0005540826 nova_compute[229148]: 2025-12-01 10:26:15.595 229152 DEBUG nova.network.neutron [req-d4ca51be-1245-43b4-aa8e-5e03465ab8a8 req-c9d4b20e-89c2-4b68-bba8-6d4f4a085c22 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6] Updating instance_info_cache with network_info: [{"id": "fdc3dac2-b9d1-4468-8307-a272a6efe638", "address": "fa:16:3e:a0:86:0b", "network": {"id": "9ab4b830-6e4f-4874-8389-c75ccb124517", "bridge": "br-int", "label": "tempest-network-smoke--1868335417", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.235", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfdc3dac2-b9", "ovs_interfaceid": "fdc3dac2-b9d1-4468-8307-a272a6efe638", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec  1 05:26:15 np0005540826 nova_compute[229148]: 2025-12-01 10:26:15.617 229152 DEBUG oslo_concurrency.lockutils [req-d4ca51be-1245-43b4-aa8e-5e03465ab8a8 req-c9d4b20e-89c2-4b68-bba8-6d4f4a085c22 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Releasing lock "refresh_cache-3540e47c-6e86-4a7a-8843-8d7f1a7b01f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec  1 05:26:15 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:26:15 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:26:15 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:26:15.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:26:15 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:26:15 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:26:15 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 05:26:15 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:26:15 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:26:15 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 05:26:16 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:26:16 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:26:16 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:26:16 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:26:16.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:26:17 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:26:17 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:26:17 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:26:17.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:26:18 np0005540826 nova_compute[229148]: 2025-12-01 10:26:18.638 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  1 05:26:18 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:26:18 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:26:18 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:26:18.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:26:19 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:26:19 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:26:19 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:26:19.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:26:20 np0005540826 nova_compute[229148]: 2025-12-01 10:26:20.072 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  1 05:26:20 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:26:20 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:26:20 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:26:20.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:26:21 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:26:21 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:26:21 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:26:21.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:26:21 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:26:22 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:26:22 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:26:22 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:26:22 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:26:22 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:26:22.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:26:22 np0005540826 podman[243639]: 2025-12-01 10:26:22.991903159 +0000 UTC m=+0.064074857 container health_status 9ce59c2758d021c154e97f8a3178b73d8f43045c59b9ba6fd93369a760a49136 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Dec  1 05:26:23 np0005540826 ovn_controller[132309]: 2025-12-01T10:26:23Z|00016|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:a0:86:0b 10.100.0.13
Dec  1 05:26:23 np0005540826 ovn_controller[132309]: 2025-12-01T10:26:23Z|00017|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:a0:86:0b 10.100.0.13
Dec  1 05:26:23 np0005540826 nova_compute[229148]: 2025-12-01 10:26:23.642 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  1 05:26:23 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:26:23 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:26:23 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:26:23.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:26:24 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:26:24 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.002000051s ======
Dec  1 05:26:24 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:26:24.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000051s
Dec  1 05:26:25 np0005540826 nova_compute[229148]: 2025-12-01 10:26:25.091 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  1 05:26:25 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:26:25 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:26:25 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:26:25.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:26:26 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:26:26 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:26:26 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:26:26 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:26:26.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:26:27 np0005540826 ceph-mon[80026]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #61. Immutable memtables: 0.
Dec  1 05:26:27 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:26:27.510606) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  1 05:26:27 np0005540826 ceph-mon[80026]: rocksdb: [db/flush_job.cc:856] [default] [JOB 35] Flushing memtable with next log file: 61
Dec  1 05:26:27 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584787510655, "job": 35, "event": "flush_started", "num_memtables": 1, "num_entries": 2360, "num_deletes": 251, "total_data_size": 6258305, "memory_usage": 6355496, "flush_reason": "Manual Compaction"}
Dec  1 05:26:27 np0005540826 ceph-mon[80026]: rocksdb: [db/flush_job.cc:885] [default] [JOB 35] Level-0 flush table #62: started
Dec  1 05:26:27 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584787635587, "cf_name": "default", "job": 35, "event": "table_file_creation", "file_number": 62, "file_size": 4044953, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 31164, "largest_seqno": 33519, "table_properties": {"data_size": 4035372, "index_size": 6011, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2501, "raw_key_size": 20031, "raw_average_key_size": 20, "raw_value_size": 4016184, "raw_average_value_size": 4110, "num_data_blocks": 258, "num_entries": 977, "num_filter_entries": 977, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764584581, "oldest_key_time": 1764584581, "file_creation_time": 1764584787, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d6e1f97e-eb58-41c1-b758-cb672eabd75e", "db_session_id": "KSHNHU57VR1LHZ6EZUM4", "orig_file_number": 62, "seqno_to_time_mapping": "N/A"}}
Dec  1 05:26:27 np0005540826 ceph-mon[80026]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 35] Flush lasted 125083 microseconds, and 15439 cpu microseconds.
Dec  1 05:26:27 np0005540826 ceph-mon[80026]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 05:26:27 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:26:27.635677) [db/flush_job.cc:967] [default] [JOB 35] Level-0 flush table #62: 4044953 bytes OK
Dec  1 05:26:27 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:26:27.635712) [db/memtable_list.cc:519] [default] Level-0 commit table #62 started
Dec  1 05:26:27 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:26:27.637538) [db/memtable_list.cc:722] [default] Level-0 commit table #62: memtable #1 done
Dec  1 05:26:27 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:26:27.637562) EVENT_LOG_v1 {"time_micros": 1764584787637554, "job": 35, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  1 05:26:27 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:26:27.637588) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  1 05:26:27 np0005540826 ceph-mon[80026]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 35] Try to delete WAL files size 6247810, prev total WAL file size 6247810, number of live WAL files 2.
Dec  1 05:26:27 np0005540826 ceph-mon[80026]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000058.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:26:27 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:26:27.640422) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032353130' seq:72057594037927935, type:22 .. '7061786F730032373632' seq:0, type:0; will stop at (end)
Dec  1 05:26:27 np0005540826 ceph-mon[80026]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 36] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  1 05:26:27 np0005540826 ceph-mon[80026]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 35 Base level 0, inputs: [62(3950KB)], [60(12MB)]
Dec  1 05:26:27 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584787640460, "job": 36, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [62], "files_L6": [60], "score": -1, "input_data_size": 16801685, "oldest_snapshot_seqno": -1}
Dec  1 05:26:27 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:26:27 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:26:27 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:26:27.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:26:27 np0005540826 ceph-mon[80026]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 36] Generated table #63: 6273 keys, 14622258 bytes, temperature: kUnknown
Dec  1 05:26:27 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584787840240, "cf_name": "default", "job": 36, "event": "table_file_creation", "file_number": 63, "file_size": 14622258, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14580681, "index_size": 24763, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 15749, "raw_key_size": 160572, "raw_average_key_size": 25, "raw_value_size": 14467995, "raw_average_value_size": 2306, "num_data_blocks": 995, "num_entries": 6273, "num_filter_entries": 6273, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582559, "oldest_key_time": 0, "file_creation_time": 1764584787, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d6e1f97e-eb58-41c1-b758-cb672eabd75e", "db_session_id": "KSHNHU57VR1LHZ6EZUM4", "orig_file_number": 63, "seqno_to_time_mapping": "N/A"}}
Dec  1 05:26:27 np0005540826 ceph-mon[80026]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 05:26:27 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:26:27.840578) [db/compaction/compaction_job.cc:1663] [default] [JOB 36] Compacted 1@0 + 1@6 files to L6 => 14622258 bytes
Dec  1 05:26:27 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:26:27.881220) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 84.1 rd, 73.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.9, 12.2 +0.0 blob) out(13.9 +0.0 blob), read-write-amplify(7.8) write-amplify(3.6) OK, records in: 6794, records dropped: 521 output_compression: NoCompression
Dec  1 05:26:27 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:26:27.881270) EVENT_LOG_v1 {"time_micros": 1764584787881252, "job": 36, "event": "compaction_finished", "compaction_time_micros": 199887, "compaction_time_cpu_micros": 31459, "output_level": 6, "num_output_files": 1, "total_output_size": 14622258, "num_input_records": 6794, "num_output_records": 6273, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  1 05:26:27 np0005540826 ceph-mon[80026]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000062.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:26:27 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584787882408, "job": 36, "event": "table_file_deletion", "file_number": 62}
Dec  1 05:26:27 np0005540826 ceph-mon[80026]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000060.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:26:27 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584787884868, "job": 36, "event": "table_file_deletion", "file_number": 60}
Dec  1 05:26:27 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:26:27.640321) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:26:27 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:26:27.884959) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:26:27 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:26:27.884967) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:26:27 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:26:27.884969) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:26:27 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:26:27.884971) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:26:27 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:26:27.884973) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:26:28 np0005540826 nova_compute[229148]: 2025-12-01 10:26:28.645 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  1 05:26:28 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:26:28 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:26:28 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:26:28.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:26:29 np0005540826 nova_compute[229148]: 2025-12-01 10:26:29.649 229152 INFO nova.compute.manager [None req-a8262bf5-9372-4746-9184-44ae5eed66cf 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6] Get console output
Dec  1 05:26:29 np0005540826 nova_compute[229148]: 2025-12-01 10:26:29.655 234904 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Dec  1 05:26:29 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:26:29 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:26:29 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:26:29.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:26:30 np0005540826 nova_compute[229148]: 2025-12-01 10:26:30.093 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  1 05:26:30 np0005540826 ovn_controller[132309]: 2025-12-01T10:26:30Z|00111|binding|INFO|Releasing lport 2b75a10e-a855-4ff1-9d02-69f69c832f5c from this chassis (sb_readonly=0)
Dec  1 05:26:30 np0005540826 nova_compute[229148]: 2025-12-01 10:26:30.741 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  1 05:26:30 np0005540826 ovn_controller[132309]: 2025-12-01T10:26:30Z|00112|binding|INFO|Releasing lport 2b75a10e-a855-4ff1-9d02-69f69c832f5c from this chassis (sb_readonly=0)
Dec  1 05:26:30 np0005540826 nova_compute[229148]: 2025-12-01 10:26:30.797 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  1 05:26:30 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:26:30 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:26:30 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:26:30.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:26:31 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:26:31 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:26:31 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:26:31.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:26:31 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:26:32 np0005540826 podman[243689]: 2025-12-01 10:26:32.003036885 +0000 UTC m=+0.083663225 container health_status 6b06bbb7dde72e1297fe084ecc5611549b12415c8a2a7d771b9728c6924dbf55 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller)
Dec  1 05:26:32 np0005540826 nova_compute[229148]: 2025-12-01 10:26:32.728 229152 INFO nova.compute.manager [None req-c476a45a-55ba-4539-93ed-7cb4d83e2c4f 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6] Get console output
Dec  1 05:26:32 np0005540826 nova_compute[229148]: 2025-12-01 10:26:32.734 234904 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Dec  1 05:26:32 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:26:32 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:26:32 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:26:32.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:26:33 np0005540826 nova_compute[229148]: 2025-12-01 10:26:33.648 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  1 05:26:33 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:26:33 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:26:33 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:26:33.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:26:34 np0005540826 nova_compute[229148]: 2025-12-01 10:26:34.360 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:26:34 np0005540826 NetworkManager[48989]: <info>  [1764584794.3607] manager: (patch-provnet-da274a4a-a49c-4f01-b728-391696cd2672-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/75)
Dec  1 05:26:34 np0005540826 NetworkManager[48989]: <info>  [1764584794.3620] manager: (patch-br-int-to-provnet-da274a4a-a49c-4f01-b728-391696cd2672): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/76)
Dec  1 05:26:34 np0005540826 ovn_controller[132309]: 2025-12-01T10:26:34Z|00113|binding|INFO|Releasing lport 2b75a10e-a855-4ff1-9d02-69f69c832f5c from this chassis (sb_readonly=0)
Dec  1 05:26:34 np0005540826 ovn_controller[132309]: 2025-12-01T10:26:34Z|00114|binding|INFO|Releasing lport 2b75a10e-a855-4ff1-9d02-69f69c832f5c from this chassis (sb_readonly=0)
Dec  1 05:26:34 np0005540826 nova_compute[229148]: 2025-12-01 10:26:34.431 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:26:34 np0005540826 nova_compute[229148]: 2025-12-01 10:26:34.844 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:26:34 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:26:34.843 141685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '36:10:da', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '4e:5c:35:98:90:37'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  1 05:26:34 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:26:34.846 141685 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  1 05:26:34 np0005540826 podman[243719]: 2025-12-01 10:26:34.966863365 +0000 UTC m=+0.048522403 container health_status 2f6a34a2eb1139886d5df897b4264e2aa17883bd121a8757ac91426f6d44356f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Dec  1 05:26:34 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:26:34 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:26:34 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:26:34.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:26:35 np0005540826 nova_compute[229148]: 2025-12-01 10:26:35.061 229152 INFO nova.compute.manager [None req-7bc9eff7-4764-49be-991c-0234dbe1fdff 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6] Get console output#033[00m
Dec  1 05:26:35 np0005540826 nova_compute[229148]: 2025-12-01 10:26:35.068 234904 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Dec  1 05:26:35 np0005540826 nova_compute[229148]: 2025-12-01 10:26:35.096 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:26:35 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:26:35 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:26:35 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:26:35.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:26:36 np0005540826 nova_compute[229148]: 2025-12-01 10:26:36.594 229152 DEBUG nova.compute.manager [req-20ff4d85-dac8-40c2-bef8-96abf5d7a589 req-433083b7-cf1c-4baa-97ec-3668aec15234 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6] Received event network-changed-fdc3dac2-b9d1-4468-8307-a272a6efe638 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  1 05:26:36 np0005540826 nova_compute[229148]: 2025-12-01 10:26:36.595 229152 DEBUG nova.compute.manager [req-20ff4d85-dac8-40c2-bef8-96abf5d7a589 req-433083b7-cf1c-4baa-97ec-3668aec15234 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6] Refreshing instance network info cache due to event network-changed-fdc3dac2-b9d1-4468-8307-a272a6efe638. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  1 05:26:36 np0005540826 nova_compute[229148]: 2025-12-01 10:26:36.595 229152 DEBUG oslo_concurrency.lockutils [req-20ff4d85-dac8-40c2-bef8-96abf5d7a589 req-433083b7-cf1c-4baa-97ec-3668aec15234 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Acquiring lock "refresh_cache-3540e47c-6e86-4a7a-8843-8d7f1a7b01f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  1 05:26:36 np0005540826 nova_compute[229148]: 2025-12-01 10:26:36.595 229152 DEBUG oslo_concurrency.lockutils [req-20ff4d85-dac8-40c2-bef8-96abf5d7a589 req-433083b7-cf1c-4baa-97ec-3668aec15234 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Acquired lock "refresh_cache-3540e47c-6e86-4a7a-8843-8d7f1a7b01f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  1 05:26:36 np0005540826 nova_compute[229148]: 2025-12-01 10:26:36.595 229152 DEBUG nova.network.neutron [req-20ff4d85-dac8-40c2-bef8-96abf5d7a589 req-433083b7-cf1c-4baa-97ec-3668aec15234 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6] Refreshing network info cache for port fdc3dac2-b9d1-4468-8307-a272a6efe638 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  1 05:26:36 np0005540826 nova_compute[229148]: 2025-12-01 10:26:36.688 229152 DEBUG oslo_concurrency.lockutils [None req-f06e7b7e-d53e-4abe-a5b6-f4a0975d58c9 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Acquiring lock "3540e47c-6e86-4a7a-8843-8d7f1a7b01f6" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:26:36 np0005540826 nova_compute[229148]: 2025-12-01 10:26:36.688 229152 DEBUG oslo_concurrency.lockutils [None req-f06e7b7e-d53e-4abe-a5b6-f4a0975d58c9 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "3540e47c-6e86-4a7a-8843-8d7f1a7b01f6" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:26:36 np0005540826 nova_compute[229148]: 2025-12-01 10:26:36.689 229152 DEBUG oslo_concurrency.lockutils [None req-f06e7b7e-d53e-4abe-a5b6-f4a0975d58c9 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Acquiring lock "3540e47c-6e86-4a7a-8843-8d7f1a7b01f6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:26:36 np0005540826 nova_compute[229148]: 2025-12-01 10:26:36.689 229152 DEBUG oslo_concurrency.lockutils [None req-f06e7b7e-d53e-4abe-a5b6-f4a0975d58c9 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "3540e47c-6e86-4a7a-8843-8d7f1a7b01f6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:26:36 np0005540826 nova_compute[229148]: 2025-12-01 10:26:36.689 229152 DEBUG oslo_concurrency.lockutils [None req-f06e7b7e-d53e-4abe-a5b6-f4a0975d58c9 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "3540e47c-6e86-4a7a-8843-8d7f1a7b01f6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:26:36 np0005540826 nova_compute[229148]: 2025-12-01 10:26:36.690 229152 INFO nova.compute.manager [None req-f06e7b7e-d53e-4abe-a5b6-f4a0975d58c9 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6] Terminating instance#033[00m
Dec  1 05:26:36 np0005540826 nova_compute[229148]: 2025-12-01 10:26:36.691 229152 DEBUG nova.compute.manager [None req-f06e7b7e-d53e-4abe-a5b6-f4a0975d58c9 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  1 05:26:36 np0005540826 kernel: tapfdc3dac2-b9 (unregistering): left promiscuous mode
Dec  1 05:26:36 np0005540826 NetworkManager[48989]: <info>  [1764584796.7589] device (tapfdc3dac2-b9): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  1 05:26:36 np0005540826 ovn_controller[132309]: 2025-12-01T10:26:36Z|00115|binding|INFO|Releasing lport fdc3dac2-b9d1-4468-8307-a272a6efe638 from this chassis (sb_readonly=0)
Dec  1 05:26:36 np0005540826 ovn_controller[132309]: 2025-12-01T10:26:36Z|00116|binding|INFO|Setting lport fdc3dac2-b9d1-4468-8307-a272a6efe638 down in Southbound
Dec  1 05:26:36 np0005540826 nova_compute[229148]: 2025-12-01 10:26:36.771 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:26:36 np0005540826 ovn_controller[132309]: 2025-12-01T10:26:36Z|00117|binding|INFO|Removing iface tapfdc3dac2-b9 ovn-installed in OVS
Dec  1 05:26:36 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:26:36.778 141685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a0:86:0b 10.100.0.13'], port_security=['fa:16:3e:a0:86:0b 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '3540e47c-6e86-4a7a-8843-8d7f1a7b01f6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9ab4b830-6e4f-4874-8389-c75ccb124517', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9f6be4e572624210b91193c011607c08', 'neutron:revision_number': '4', 'neutron:security_group_ids': '511c736b-c125-4be8-86f9-d725384a5e8b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f739f97a-512c-42d9-b283-488e92f014a6, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe07c6626d0>], logical_port=fdc3dac2-b9d1-4468-8307-a272a6efe638) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe07c6626d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  1 05:26:36 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:26:36.779 141685 INFO neutron.agent.ovn.metadata.agent [-] Port fdc3dac2-b9d1-4468-8307-a272a6efe638 in datapath 9ab4b830-6e4f-4874-8389-c75ccb124517 unbound from our chassis#033[00m
Dec  1 05:26:36 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:26:36.780 141685 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9ab4b830-6e4f-4874-8389-c75ccb124517, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  1 05:26:36 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:26:36.785 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[cd12b442-f416-450e-bbe3-bddd92b5d26e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:26:36 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:26:36.787 141685 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9ab4b830-6e4f-4874-8389-c75ccb124517 namespace which is not needed anymore#033[00m
Dec  1 05:26:36 np0005540826 nova_compute[229148]: 2025-12-01 10:26:36.792 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:26:36 np0005540826 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d0000000d.scope: Deactivated successfully.
Dec  1 05:26:36 np0005540826 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d0000000d.scope: Consumed 13.285s CPU time.
Dec  1 05:26:36 np0005540826 systemd-machined[192474]: Machine qemu-7-instance-0000000d terminated.
Dec  1 05:26:36 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:26:36 np0005540826 neutron-haproxy-ovnmeta-9ab4b830-6e4f-4874-8389-c75ccb124517[243509]: [NOTICE]   (243513) : haproxy version is 2.8.14-c23fe91
Dec  1 05:26:36 np0005540826 neutron-haproxy-ovnmeta-9ab4b830-6e4f-4874-8389-c75ccb124517[243509]: [NOTICE]   (243513) : path to executable is /usr/sbin/haproxy
Dec  1 05:26:36 np0005540826 neutron-haproxy-ovnmeta-9ab4b830-6e4f-4874-8389-c75ccb124517[243509]: [WARNING]  (243513) : Exiting Master process...
Dec  1 05:26:36 np0005540826 neutron-haproxy-ovnmeta-9ab4b830-6e4f-4874-8389-c75ccb124517[243509]: [ALERT]    (243513) : Current worker (243515) exited with code 143 (Terminated)
Dec  1 05:26:36 np0005540826 neutron-haproxy-ovnmeta-9ab4b830-6e4f-4874-8389-c75ccb124517[243509]: [WARNING]  (243513) : All workers exited. Exiting... (0)
Dec  1 05:26:36 np0005540826 systemd[1]: libpod-49171569bc7cee58e2a373e4973146bedfb0ec324cd81bd23276b3cb01a863d7.scope: Deactivated successfully.
Dec  1 05:26:36 np0005540826 podman[243763]: 2025-12-01 10:26:36.93230918 +0000 UTC m=+0.052191376 container died 49171569bc7cee58e2a373e4973146bedfb0ec324cd81bd23276b3cb01a863d7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9ab4b830-6e4f-4874-8389-c75ccb124517, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec  1 05:26:36 np0005540826 nova_compute[229148]: 2025-12-01 10:26:36.936 229152 INFO nova.virt.libvirt.driver [-] [instance: 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6] Instance destroyed successfully.#033[00m
Dec  1 05:26:36 np0005540826 nova_compute[229148]: 2025-12-01 10:26:36.938 229152 DEBUG nova.objects.instance [None req-f06e7b7e-d53e-4abe-a5b6-f4a0975d58c9 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lazy-loading 'resources' on Instance uuid 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  1 05:26:36 np0005540826 nova_compute[229148]: 2025-12-01 10:26:36.953 229152 DEBUG nova.virt.libvirt.vif [None req-f06e7b7e-d53e-4abe-a5b6-f4a0975d58c9 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-01T10:26:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-710555929',display_name='tempest-TestNetworkBasicOps-server-710555929',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-710555929',id=13,image_ref='8f75d6de-6ce0-44e1-b417-d0111424475b',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOCJULcfZ8I5IIMKMlR0pdassHRHuTcUFJWyuYNAB+a392CmeehyQeXIhQKo6FtMH2YikcXsxBJkVcxPOc85XzYnMu9gnibnkrDfq9TT6mFvC7c+O5MtR5wWoaeFcjpoBA==',key_name='tempest-TestNetworkBasicOps-1621648275',keypairs=<?>,launch_index=0,launched_at=2025-12-01T10:26:10Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='9f6be4e572624210b91193c011607c08',ramdisk_id='',reservation_id='r-rq521rmu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='8f75d6de-6ce0-44e1-b417-d0111424475b',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1248115384',owner_user_name='tempest-TestNetworkBasicOps-1248115384-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-01T10:26:10Z,user_data=None,user_id='5b56a238daf0445798410e51caada0ff',uuid=3540e47c-6e86-4a7a-8843-8d7f1a7b01f6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "fdc3dac2-b9d1-4468-8307-a272a6efe638", "address": "fa:16:3e:a0:86:0b", "network": {"id": "9ab4b830-6e4f-4874-8389-c75ccb124517", "bridge": "br-int", "label": "tempest-network-smoke--1868335417", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.235", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfdc3dac2-b9", "ovs_interfaceid": "fdc3dac2-b9d1-4468-8307-a272a6efe638", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  1 05:26:36 np0005540826 nova_compute[229148]: 2025-12-01 10:26:36.954 229152 DEBUG nova.network.os_vif_util [None req-f06e7b7e-d53e-4abe-a5b6-f4a0975d58c9 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Converting VIF {"id": "fdc3dac2-b9d1-4468-8307-a272a6efe638", "address": "fa:16:3e:a0:86:0b", "network": {"id": "9ab4b830-6e4f-4874-8389-c75ccb124517", "bridge": "br-int", "label": "tempest-network-smoke--1868335417", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.235", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfdc3dac2-b9", "ovs_interfaceid": "fdc3dac2-b9d1-4468-8307-a272a6efe638", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  1 05:26:36 np0005540826 nova_compute[229148]: 2025-12-01 10:26:36.955 229152 DEBUG nova.network.os_vif_util [None req-f06e7b7e-d53e-4abe-a5b6-f4a0975d58c9 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:a0:86:0b,bridge_name='br-int',has_traffic_filtering=True,id=fdc3dac2-b9d1-4468-8307-a272a6efe638,network=Network(9ab4b830-6e4f-4874-8389-c75ccb124517),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfdc3dac2-b9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  1 05:26:36 np0005540826 nova_compute[229148]: 2025-12-01 10:26:36.955 229152 DEBUG os_vif [None req-f06e7b7e-d53e-4abe-a5b6-f4a0975d58c9 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:a0:86:0b,bridge_name='br-int',has_traffic_filtering=True,id=fdc3dac2-b9d1-4468-8307-a272a6efe638,network=Network(9ab4b830-6e4f-4874-8389-c75ccb124517),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfdc3dac2-b9') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  1 05:26:36 np0005540826 nova_compute[229148]: 2025-12-01 10:26:36.959 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:26:36 np0005540826 nova_compute[229148]: 2025-12-01 10:26:36.959 229152 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfdc3dac2-b9, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  1 05:26:36 np0005540826 nova_compute[229148]: 2025-12-01 10:26:36.961 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:26:36 np0005540826 nova_compute[229148]: 2025-12-01 10:26:36.964 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  1 05:26:36 np0005540826 nova_compute[229148]: 2025-12-01 10:26:36.967 229152 INFO os_vif [None req-f06e7b7e-d53e-4abe-a5b6-f4a0975d58c9 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:a0:86:0b,bridge_name='br-int',has_traffic_filtering=True,id=fdc3dac2-b9d1-4468-8307-a272a6efe638,network=Network(9ab4b830-6e4f-4874-8389-c75ccb124517),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfdc3dac2-b9')#033[00m
Dec  1 05:26:36 np0005540826 systemd[1]: var-lib-containers-storage-overlay-1e865b783acf00d8416dbbed118393538747baafb619f4c6a6e9cef16cf12f05-merged.mount: Deactivated successfully.
Dec  1 05:26:36 np0005540826 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-49171569bc7cee58e2a373e4973146bedfb0ec324cd81bd23276b3cb01a863d7-userdata-shm.mount: Deactivated successfully.
Dec  1 05:26:36 np0005540826 podman[243763]: 2025-12-01 10:26:36.977977779 +0000 UTC m=+0.097860035 container cleanup 49171569bc7cee58e2a373e4973146bedfb0ec324cd81bd23276b3cb01a863d7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9ab4b830-6e4f-4874-8389-c75ccb124517, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3)
Dec  1 05:26:36 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:26:36 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:26:36 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:26:36.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:26:37 np0005540826 systemd[1]: libpod-conmon-49171569bc7cee58e2a373e4973146bedfb0ec324cd81bd23276b3cb01a863d7.scope: Deactivated successfully.
Dec  1 05:26:37 np0005540826 podman[243812]: 2025-12-01 10:26:37.049878305 +0000 UTC m=+0.041417043 container remove 49171569bc7cee58e2a373e4973146bedfb0ec324cd81bd23276b3cb01a863d7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9ab4b830-6e4f-4874-8389-c75ccb124517, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Dec  1 05:26:37 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:26:37.056 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[75d690e1-849e-48a4-b5e6-8778cca312ea]: (4, ('Mon Dec  1 10:26:36 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-9ab4b830-6e4f-4874-8389-c75ccb124517 (49171569bc7cee58e2a373e4973146bedfb0ec324cd81bd23276b3cb01a863d7)\n49171569bc7cee58e2a373e4973146bedfb0ec324cd81bd23276b3cb01a863d7\nMon Dec  1 10:26:36 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-9ab4b830-6e4f-4874-8389-c75ccb124517 (49171569bc7cee58e2a373e4973146bedfb0ec324cd81bd23276b3cb01a863d7)\n49171569bc7cee58e2a373e4973146bedfb0ec324cd81bd23276b3cb01a863d7\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:26:37 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:26:37.057 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[1d5308fa-f5f7-49dc-8c00-5934e6128497]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:26:37 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:26:37.058 141685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9ab4b830-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  1 05:26:37 np0005540826 nova_compute[229148]: 2025-12-01 10:26:37.061 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:26:37 np0005540826 kernel: tap9ab4b830-60: left promiscuous mode
Dec  1 05:26:37 np0005540826 nova_compute[229148]: 2025-12-01 10:26:37.073 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:26:37 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:26:37.075 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[7c37db77-bffd-4383-b882-66cc744ae672]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:26:37 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:26:37.091 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[47a9cb8c-f586-4d0f-a505-32d5254ddb5b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:26:37 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:26:37.092 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[ac1e994d-4034-4083-8927-2120b08a513e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:26:37 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:26:37.108 233565 DEBUG oslo.privsep.daemon [-] privsep: reply[ea8514aa-d5c6-49b6-aed3-f3ff8fcf6d57]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 463232, 'reachable_time': 29854, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 243831, 'error': None, 'target': 'ovnmeta-9ab4b830-6e4f-4874-8389-c75ccb124517', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:26:37 np0005540826 systemd[1]: run-netns-ovnmeta\x2d9ab4b830\x2d6e4f\x2d4874\x2d8389\x2dc75ccb124517.mount: Deactivated successfully.
Dec  1 05:26:37 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:26:37.112 141797 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9ab4b830-6e4f-4874-8389-c75ccb124517 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec  1 05:26:37 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:26:37.113 141797 DEBUG oslo.privsep.daemon [-] privsep: reply[f9aa9710-a7c4-49b9-ae77-12589c3bd75a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:26:37 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:26:37 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:26:37 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:26:37.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:26:37 np0005540826 nova_compute[229148]: 2025-12-01 10:26:37.909 229152 INFO nova.virt.libvirt.driver [None req-f06e7b7e-d53e-4abe-a5b6-f4a0975d58c9 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6] Deleting instance files /var/lib/nova/instances/3540e47c-6e86-4a7a-8843-8d7f1a7b01f6_del#033[00m
Dec  1 05:26:37 np0005540826 nova_compute[229148]: 2025-12-01 10:26:37.911 229152 INFO nova.virt.libvirt.driver [None req-f06e7b7e-d53e-4abe-a5b6-f4a0975d58c9 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6] Deletion of /var/lib/nova/instances/3540e47c-6e86-4a7a-8843-8d7f1a7b01f6_del complete#033[00m
Dec  1 05:26:37 np0005540826 nova_compute[229148]: 2025-12-01 10:26:37.972 229152 INFO nova.compute.manager [None req-f06e7b7e-d53e-4abe-a5b6-f4a0975d58c9 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] [instance: 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6] Took 1.28 seconds to destroy the instance on the hypervisor.#033[00m
Dec  1 05:26:37 np0005540826 nova_compute[229148]: 2025-12-01 10:26:37.973 229152 DEBUG oslo.service.loopingcall [None req-f06e7b7e-d53e-4abe-a5b6-f4a0975d58c9 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  1 05:26:37 np0005540826 nova_compute[229148]: 2025-12-01 10:26:37.973 229152 DEBUG nova.compute.manager [-] [instance: 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  1 05:26:37 np0005540826 nova_compute[229148]: 2025-12-01 10:26:37.974 229152 DEBUG nova.network.neutron [-] [instance: 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  1 05:26:38 np0005540826 nova_compute[229148]: 2025-12-01 10:26:38.681 229152 DEBUG nova.compute.manager [req-2e0bef8c-393e-418d-833e-ebf6542eb199 req-1bc6731e-aaf4-4609-9ac7-46155106df4f dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6] Received event network-vif-unplugged-fdc3dac2-b9d1-4468-8307-a272a6efe638 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  1 05:26:38 np0005540826 nova_compute[229148]: 2025-12-01 10:26:38.682 229152 DEBUG oslo_concurrency.lockutils [req-2e0bef8c-393e-418d-833e-ebf6542eb199 req-1bc6731e-aaf4-4609-9ac7-46155106df4f dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Acquiring lock "3540e47c-6e86-4a7a-8843-8d7f1a7b01f6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:26:38 np0005540826 nova_compute[229148]: 2025-12-01 10:26:38.682 229152 DEBUG oslo_concurrency.lockutils [req-2e0bef8c-393e-418d-833e-ebf6542eb199 req-1bc6731e-aaf4-4609-9ac7-46155106df4f dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Lock "3540e47c-6e86-4a7a-8843-8d7f1a7b01f6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:26:38 np0005540826 nova_compute[229148]: 2025-12-01 10:26:38.682 229152 DEBUG oslo_concurrency.lockutils [req-2e0bef8c-393e-418d-833e-ebf6542eb199 req-1bc6731e-aaf4-4609-9ac7-46155106df4f dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Lock "3540e47c-6e86-4a7a-8843-8d7f1a7b01f6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:26:38 np0005540826 nova_compute[229148]: 2025-12-01 10:26:38.682 229152 DEBUG nova.compute.manager [req-2e0bef8c-393e-418d-833e-ebf6542eb199 req-1bc6731e-aaf4-4609-9ac7-46155106df4f dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6] No waiting events found dispatching network-vif-unplugged-fdc3dac2-b9d1-4468-8307-a272a6efe638 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  1 05:26:38 np0005540826 nova_compute[229148]: 2025-12-01 10:26:38.683 229152 DEBUG nova.compute.manager [req-2e0bef8c-393e-418d-833e-ebf6542eb199 req-1bc6731e-aaf4-4609-9ac7-46155106df4f dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6] Received event network-vif-unplugged-fdc3dac2-b9d1-4468-8307-a272a6efe638 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec  1 05:26:38 np0005540826 nova_compute[229148]: 2025-12-01 10:26:38.683 229152 DEBUG nova.compute.manager [req-2e0bef8c-393e-418d-833e-ebf6542eb199 req-1bc6731e-aaf4-4609-9ac7-46155106df4f dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6] Received event network-vif-plugged-fdc3dac2-b9d1-4468-8307-a272a6efe638 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  1 05:26:38 np0005540826 nova_compute[229148]: 2025-12-01 10:26:38.683 229152 DEBUG oslo_concurrency.lockutils [req-2e0bef8c-393e-418d-833e-ebf6542eb199 req-1bc6731e-aaf4-4609-9ac7-46155106df4f dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Acquiring lock "3540e47c-6e86-4a7a-8843-8d7f1a7b01f6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:26:38 np0005540826 nova_compute[229148]: 2025-12-01 10:26:38.684 229152 DEBUG oslo_concurrency.lockutils [req-2e0bef8c-393e-418d-833e-ebf6542eb199 req-1bc6731e-aaf4-4609-9ac7-46155106df4f dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Lock "3540e47c-6e86-4a7a-8843-8d7f1a7b01f6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:26:38 np0005540826 nova_compute[229148]: 2025-12-01 10:26:38.684 229152 DEBUG oslo_concurrency.lockutils [req-2e0bef8c-393e-418d-833e-ebf6542eb199 req-1bc6731e-aaf4-4609-9ac7-46155106df4f dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Lock "3540e47c-6e86-4a7a-8843-8d7f1a7b01f6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:26:38 np0005540826 nova_compute[229148]: 2025-12-01 10:26:38.684 229152 DEBUG nova.compute.manager [req-2e0bef8c-393e-418d-833e-ebf6542eb199 req-1bc6731e-aaf4-4609-9ac7-46155106df4f dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6] No waiting events found dispatching network-vif-plugged-fdc3dac2-b9d1-4468-8307-a272a6efe638 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  1 05:26:38 np0005540826 nova_compute[229148]: 2025-12-01 10:26:38.685 229152 WARNING nova.compute.manager [req-2e0bef8c-393e-418d-833e-ebf6542eb199 req-1bc6731e-aaf4-4609-9ac7-46155106df4f dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6] Received unexpected event network-vif-plugged-fdc3dac2-b9d1-4468-8307-a272a6efe638 for instance with vm_state active and task_state deleting.#033[00m
Dec  1 05:26:38 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:26:38 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:26:38 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:26:38.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:26:39 np0005540826 nova_compute[229148]: 2025-12-01 10:26:39.476 229152 DEBUG nova.network.neutron [-] [instance: 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  1 05:26:39 np0005540826 nova_compute[229148]: 2025-12-01 10:26:39.496 229152 INFO nova.compute.manager [-] [instance: 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6] Took 1.52 seconds to deallocate network for instance.#033[00m
Dec  1 05:26:39 np0005540826 nova_compute[229148]: 2025-12-01 10:26:39.548 229152 DEBUG oslo_concurrency.lockutils [None req-f06e7b7e-d53e-4abe-a5b6-f4a0975d58c9 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:26:39 np0005540826 nova_compute[229148]: 2025-12-01 10:26:39.549 229152 DEBUG oslo_concurrency.lockutils [None req-f06e7b7e-d53e-4abe-a5b6-f4a0975d58c9 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:26:39 np0005540826 nova_compute[229148]: 2025-12-01 10:26:39.557 229152 DEBUG nova.compute.manager [req-d6066f93-c446-4f15-bd7b-3dc36ad992dd req-62a9b147-8074-40aa-bc0d-b8583187f52e dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6] Received event network-vif-deleted-fdc3dac2-b9d1-4468-8307-a272a6efe638 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  1 05:26:39 np0005540826 nova_compute[229148]: 2025-12-01 10:26:39.602 229152 DEBUG oslo_concurrency.processutils [None req-f06e7b7e-d53e-4abe-a5b6-f4a0975d58c9 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:26:39 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:26:39 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:26:39 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:26:39.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:26:39 np0005540826 nova_compute[229148]: 2025-12-01 10:26:39.694 229152 DEBUG nova.network.neutron [req-20ff4d85-dac8-40c2-bef8-96abf5d7a589 req-433083b7-cf1c-4baa-97ec-3668aec15234 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6] Updated VIF entry in instance network info cache for port fdc3dac2-b9d1-4468-8307-a272a6efe638. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  1 05:26:39 np0005540826 nova_compute[229148]: 2025-12-01 10:26:39.696 229152 DEBUG nova.network.neutron [req-20ff4d85-dac8-40c2-bef8-96abf5d7a589 req-433083b7-cf1c-4baa-97ec-3668aec15234 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] [instance: 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6] Updating instance_info_cache with network_info: [{"id": "fdc3dac2-b9d1-4468-8307-a272a6efe638", "address": "fa:16:3e:a0:86:0b", "network": {"id": "9ab4b830-6e4f-4874-8389-c75ccb124517", "bridge": "br-int", "label": "tempest-network-smoke--1868335417", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f6be4e572624210b91193c011607c08", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfdc3dac2-b9", "ovs_interfaceid": "fdc3dac2-b9d1-4468-8307-a272a6efe638", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  1 05:26:39 np0005540826 nova_compute[229148]: 2025-12-01 10:26:39.721 229152 DEBUG oslo_concurrency.lockutils [req-20ff4d85-dac8-40c2-bef8-96abf5d7a589 req-433083b7-cf1c-4baa-97ec-3668aec15234 dacba8d8330f4064ba77b4caeb0c4756 701c7475017845dbbaa4460b007ffc6f - - default default] Releasing lock "refresh_cache-3540e47c-6e86-4a7a-8843-8d7f1a7b01f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  1 05:26:40 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:26:40 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1709413327' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:26:40 np0005540826 nova_compute[229148]: 2025-12-01 10:26:40.135 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:26:40 np0005540826 nova_compute[229148]: 2025-12-01 10:26:40.155 229152 DEBUG oslo_concurrency.processutils [None req-f06e7b7e-d53e-4abe-a5b6-f4a0975d58c9 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.553s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:26:40 np0005540826 nova_compute[229148]: 2025-12-01 10:26:40.162 229152 DEBUG nova.compute.provider_tree [None req-f06e7b7e-d53e-4abe-a5b6-f4a0975d58c9 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Inventory has not changed in ProviderTree for provider: 19014d04-db84-4f3d-831b-084720e9168c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  1 05:26:40 np0005540826 nova_compute[229148]: 2025-12-01 10:26:40.177 229152 DEBUG nova.scheduler.client.report [None req-f06e7b7e-d53e-4abe-a5b6-f4a0975d58c9 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Inventory has not changed for provider 19014d04-db84-4f3d-831b-084720e9168c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  1 05:26:40 np0005540826 nova_compute[229148]: 2025-12-01 10:26:40.202 229152 DEBUG oslo_concurrency.lockutils [None req-f06e7b7e-d53e-4abe-a5b6-f4a0975d58c9 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.653s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:26:40 np0005540826 nova_compute[229148]: 2025-12-01 10:26:40.234 229152 INFO nova.scheduler.client.report [None req-f06e7b7e-d53e-4abe-a5b6-f4a0975d58c9 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Deleted allocations for instance 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6#033[00m
Dec  1 05:26:40 np0005540826 nova_compute[229148]: 2025-12-01 10:26:40.319 229152 DEBUG oslo_concurrency.lockutils [None req-f06e7b7e-d53e-4abe-a5b6-f4a0975d58c9 5b56a238daf0445798410e51caada0ff 9f6be4e572624210b91193c011607c08 - - default default] Lock "3540e47c-6e86-4a7a-8843-8d7f1a7b01f6" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.631s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:26:40 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:26:40.848 141685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=b99910e3-15ec-4cc7-b887-f5229f22d165, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  1 05:26:40 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:26:40 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:26:40 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:26:40.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:26:41 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:26:41 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:26:41 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:26:41.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:26:41 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:26:41 np0005540826 nova_compute[229148]: 2025-12-01 10:26:41.964 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:26:42 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:26:43 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:26:43 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:26:42.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:26:43 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:26:43 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:26:43 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:26:43.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:26:45 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:26:45 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:26:45 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:26:45.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:26:45 np0005540826 nova_compute[229148]: 2025-12-01 10:26:45.021 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:26:45 np0005540826 nova_compute[229148]: 2025-12-01 10:26:45.095 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:26:45 np0005540826 nova_compute[229148]: 2025-12-01 10:26:45.138 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:26:45 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:26:45 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:26:45 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:26:45.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:26:46 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:26:46 np0005540826 nova_compute[229148]: 2025-12-01 10:26:46.968 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:26:47 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:26:47 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:26:47 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:26:47.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:26:47 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:26:47 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:26:47 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:26:47.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:26:49 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:26:49 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:26:49 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:26:49.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:26:49 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:26:49 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:26:49 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:26:49.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:26:50 np0005540826 nova_compute[229148]: 2025-12-01 10:26:50.140 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:26:51 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:26:51 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:26:51 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:26:51.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:26:51 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:26:51 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.002000050s ======
Dec  1 05:26:51 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:26:51.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000050s
Dec  1 05:26:51 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:26:51 np0005540826 nova_compute[229148]: 2025-12-01 10:26:51.936 229152 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764584796.930495, 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  1 05:26:51 np0005540826 nova_compute[229148]: 2025-12-01 10:26:51.939 229152 INFO nova.compute.manager [-] [instance: 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6] VM Stopped (Lifecycle Event)#033[00m
Dec  1 05:26:51 np0005540826 nova_compute[229148]: 2025-12-01 10:26:51.968 229152 DEBUG nova.compute.manager [None req-e8543110-8006-4e3d-9614-d87f47557f84 - - - - - -] [instance: 3540e47c-6e86-4a7a-8843-8d7f1a7b01f6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  1 05:26:51 np0005540826 nova_compute[229148]: 2025-12-01 10:26:51.972 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:26:53 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:26:53 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:26:53 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:26:53.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:26:53 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:26:53 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:26:53 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:26:53.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:26:53 np0005540826 podman[243889]: 2025-12-01 10:26:53.995028254 +0000 UTC m=+0.068160552 container health_status 9ce59c2758d021c154e97f8a3178b73d8f43045c59b9ba6fd93369a760a49136 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3)
Dec  1 05:26:55 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:26:55 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:26:55 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:26:55.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:26:55 np0005540826 nova_compute[229148]: 2025-12-01 10:26:55.141 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:26:55 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:26:55 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:26:55 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:26:55.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:26:56 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:26:56 np0005540826 nova_compute[229148]: 2025-12-01 10:26:56.976 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:26:57 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:26:57 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:26:57 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:26:57.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:26:57 np0005540826 nova_compute[229148]: 2025-12-01 10:26:57.405 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:26:57 np0005540826 nova_compute[229148]: 2025-12-01 10:26:57.406 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  1 05:26:57 np0005540826 nova_compute[229148]: 2025-12-01 10:26:57.406 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  1 05:26:57 np0005540826 nova_compute[229148]: 2025-12-01 10:26:57.429 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  1 05:26:57 np0005540826 nova_compute[229148]: 2025-12-01 10:26:57.429 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:26:57 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:26:57 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:26:57 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:26:57.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:26:59 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:26:59 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:26:59 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:26:59.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:26:59 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:26:59 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:26:59 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:26:59.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:27:00 np0005540826 nova_compute[229148]: 2025-12-01 10:27:00.110 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:27:00 np0005540826 nova_compute[229148]: 2025-12-01 10:27:00.176 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:27:01 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:27:01 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:27:01 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:27:01.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:27:01 np0005540826 nova_compute[229148]: 2025-12-01 10:27:01.109 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:27:01 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:27:01 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:27:01 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:27:01.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:27:01 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:27:01 np0005540826 nova_compute[229148]: 2025-12-01 10:27:01.980 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:27:02 np0005540826 nova_compute[229148]: 2025-12-01 10:27:02.109 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:27:02 np0005540826 nova_compute[229148]: 2025-12-01 10:27:02.110 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:27:02 np0005540826 nova_compute[229148]: 2025-12-01 10:27:02.110 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  1 05:27:02 np0005540826 podman[243916]: 2025-12-01 10:27:02.999550201 +0000 UTC m=+0.079850488 container health_status 6b06bbb7dde72e1297fe084ecc5611549b12415c8a2a7d771b9728c6924dbf55 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  1 05:27:03 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:27:03 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:27:03 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:27:03.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:27:03 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:27:03 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:27:03 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:27:03.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:27:04 np0005540826 nova_compute[229148]: 2025-12-01 10:27:04.105 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:27:04 np0005540826 nova_compute[229148]: 2025-12-01 10:27:04.109 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:27:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:27:04.560 141685 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:27:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:27:04.561 141685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:27:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:27:04.561 141685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:27:05 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:27:05 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:27:05 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:27:05.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:27:05 np0005540826 nova_compute[229148]: 2025-12-01 10:27:05.178 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:27:05 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:27:05 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:27:05 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:27:05.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:27:05 np0005540826 podman[243942]: 2025-12-01 10:27:05.962512609 +0000 UTC m=+0.045778833 container health_status 2f6a34a2eb1139886d5df897b4264e2aa17883bd121a8757ac91426f6d44356f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec  1 05:27:06 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:27:06 np0005540826 nova_compute[229148]: 2025-12-01 10:27:06.983 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:27:07 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:27:07 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:27:07 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:27:07.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:27:07 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec  1 05:27:07 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1614089083' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec  1 05:27:07 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec  1 05:27:07 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1614089083' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec  1 05:27:07 np0005540826 nova_compute[229148]: 2025-12-01 10:27:07.109 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:27:07 np0005540826 nova_compute[229148]: 2025-12-01 10:27:07.134 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:27:07 np0005540826 nova_compute[229148]: 2025-12-01 10:27:07.134 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:27:07 np0005540826 nova_compute[229148]: 2025-12-01 10:27:07.134 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:27:07 np0005540826 nova_compute[229148]: 2025-12-01 10:27:07.135 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  1 05:27:07 np0005540826 nova_compute[229148]: 2025-12-01 10:27:07.135 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:27:07 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:27:07 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2876913939' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:27:07 np0005540826 nova_compute[229148]: 2025-12-01 10:27:07.583 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.448s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:27:07 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:27:07 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:27:07 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:27:07.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:27:07 np0005540826 nova_compute[229148]: 2025-12-01 10:27:07.789 229152 WARNING nova.virt.libvirt.driver [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  1 05:27:07 np0005540826 nova_compute[229148]: 2025-12-01 10:27:07.791 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4913MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  1 05:27:07 np0005540826 nova_compute[229148]: 2025-12-01 10:27:07.791 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:27:07 np0005540826 nova_compute[229148]: 2025-12-01 10:27:07.792 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:27:07 np0005540826 nova_compute[229148]: 2025-12-01 10:27:07.851 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  1 05:27:07 np0005540826 nova_compute[229148]: 2025-12-01 10:27:07.852 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  1 05:27:07 np0005540826 nova_compute[229148]: 2025-12-01 10:27:07.874 229152 DEBUG nova.scheduler.client.report [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Refreshing inventories for resource provider 19014d04-db84-4f3d-831b-084720e9168c _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Dec  1 05:27:07 np0005540826 nova_compute[229148]: 2025-12-01 10:27:07.892 229152 DEBUG nova.scheduler.client.report [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Updating ProviderTree inventory for provider 19014d04-db84-4f3d-831b-084720e9168c from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Dec  1 05:27:07 np0005540826 nova_compute[229148]: 2025-12-01 10:27:07.893 229152 DEBUG nova.compute.provider_tree [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Updating inventory in ProviderTree for provider 19014d04-db84-4f3d-831b-084720e9168c with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Dec  1 05:27:07 np0005540826 nova_compute[229148]: 2025-12-01 10:27:07.905 229152 DEBUG nova.scheduler.client.report [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Refreshing aggregate associations for resource provider 19014d04-db84-4f3d-831b-084720e9168c, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Dec  1 05:27:07 np0005540826 nova_compute[229148]: 2025-12-01 10:27:07.926 229152 DEBUG nova.scheduler.client.report [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Refreshing trait associations for resource provider 19014d04-db84-4f3d-831b-084720e9168c, traits: COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SHA,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE2,HW_CPU_X86_ABM,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_FDC,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE4A,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE,HW_CPU_X86_AMD_SVM,HW_CPU_X86_MMX,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_AESNI,HW_CPU_X86_AVX2,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSE42,COMPUTE_RESCUE_BFV,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_AVX,HW_CPU_X86_F16C,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_CLMUL,HW_CPU_X86_BMI,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_BMI2,HW_CPU_X86_FMA3,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SVM,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SOCKET_PCI_NUMA_AFFINITY _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Dec  1 05:27:07 np0005540826 nova_compute[229148]: 2025-12-01 10:27:07.948 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:27:08 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:27:08 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/662702271' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:27:08 np0005540826 nova_compute[229148]: 2025-12-01 10:27:08.423 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.475s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:27:08 np0005540826 nova_compute[229148]: 2025-12-01 10:27:08.432 229152 DEBUG nova.compute.provider_tree [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Inventory has not changed in ProviderTree for provider: 19014d04-db84-4f3d-831b-084720e9168c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  1 05:27:08 np0005540826 nova_compute[229148]: 2025-12-01 10:27:08.449 229152 DEBUG nova.scheduler.client.report [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Inventory has not changed for provider 19014d04-db84-4f3d-831b-084720e9168c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  1 05:27:08 np0005540826 nova_compute[229148]: 2025-12-01 10:27:08.473 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  1 05:27:08 np0005540826 nova_compute[229148]: 2025-12-01 10:27:08.474 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.682s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:27:09 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:27:09 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:27:09 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:27:09.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:27:09 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:27:09 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:27:09 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:27:09.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:27:10 np0005540826 nova_compute[229148]: 2025-12-01 10:27:10.180 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:27:11 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:27:11 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:27:11 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:27:11.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:27:11 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:27:11 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:27:11 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:27:11.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:27:11 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:27:11 np0005540826 nova_compute[229148]: 2025-12-01 10:27:11.987 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:27:13 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:27:13 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:27:13 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:27:13.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:27:13 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:27:13 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:27:13 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:27:13.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:27:15 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:27:15 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:27:15 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:27:15.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:27:15 np0005540826 nova_compute[229148]: 2025-12-01 10:27:15.207 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:27:15 np0005540826 ovn_controller[132309]: 2025-12-01T10:27:15Z|00118|memory_trim|INFO|Detected inactivity (last active 30007 ms ago): trimming memory
Dec  1 05:27:15 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:27:15 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:27:15 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:27:15.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:27:16 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:27:16 np0005540826 nova_compute[229148]: 2025-12-01 10:27:16.992 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:27:17 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:27:17 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:27:17 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:27:17.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:27:17 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:27:17 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:27:17 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:27:17.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:27:19 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:27:19 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:27:19 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:27:19.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:27:19 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:27:19 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:27:19 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:27:19.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:27:20 np0005540826 nova_compute[229148]: 2025-12-01 10:27:20.251 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:27:21 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:27:21 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:27:21 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:27:21.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:27:21 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:27:21 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:27:21 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:27:21.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:27:21 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:27:21 np0005540826 nova_compute[229148]: 2025-12-01 10:27:21.996 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:27:23 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:27:23 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.002000050s ======
Dec  1 05:27:23 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:27:23.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000050s
Dec  1 05:27:23 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 05:27:23 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:27:23 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:27:23 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 05:27:23 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:27:23 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:27:23 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:27:23.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:27:24 np0005540826 podman[244125]: 2025-12-01 10:27:24.985495476 +0000 UTC m=+0.065200247 container health_status 9ce59c2758d021c154e97f8a3178b73d8f43045c59b9ba6fd93369a760a49136 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec  1 05:27:25 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:27:25 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:27:25 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:27:25.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:27:25 np0005540826 nova_compute[229148]: 2025-12-01 10:27:25.253 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:27:25 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:27:25 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:27:25 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:27:25.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:27:26 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:27:27 np0005540826 nova_compute[229148]: 2025-12-01 10:27:27.000 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:27:27 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:27:27 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:27:27 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:27:27.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:27:27 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:27:27 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:27:27 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:27:27.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:27:29 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:27:29 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:27:29 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:27:29.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:27:29 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:27:29 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:27:29 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:27:29 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:27:29 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:27:29.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:27:30 np0005540826 nova_compute[229148]: 2025-12-01 10:27:30.256 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:27:31 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:27:31 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:27:31 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:27:31.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:27:31 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:27:31 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:27:31 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:27:31.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:27:31 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:27:32 np0005540826 nova_compute[229148]: 2025-12-01 10:27:32.003 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:27:33 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:27:33 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:27:33 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:27:33.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:27:33 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:27:33 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:27:33 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:27:33.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:27:34 np0005540826 podman[244197]: 2025-12-01 10:27:34.007981081 +0000 UTC m=+0.094693175 container health_status 6b06bbb7dde72e1297fe084ecc5611549b12415c8a2a7d771b9728c6924dbf55 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec  1 05:27:35 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:27:35 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:27:35 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:27:35.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:27:35 np0005540826 nova_compute[229148]: 2025-12-01 10:27:35.260 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:27:35 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:27:35 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:27:35 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:27:35.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:27:36 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:27:36 np0005540826 podman[244226]: 2025-12-01 10:27:36.971125753 +0000 UTC m=+0.057734817 container health_status 2f6a34a2eb1139886d5df897b4264e2aa17883bd121a8757ac91426f6d44356f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Dec  1 05:27:37 np0005540826 nova_compute[229148]: 2025-12-01 10:27:37.006 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:27:37 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:27:37 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:27:37 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:27:37.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:27:37 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:27:37 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:27:37 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:27:37.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:27:39 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:27:39 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:27:39 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:27:39.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:27:39 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:27:39 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:27:39 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:27:39.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:27:40 np0005540826 nova_compute[229148]: 2025-12-01 10:27:40.262 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:27:41 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:27:41 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:27:41 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:27:41.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:27:41 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:27:41 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:27:41 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:27:41.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:27:41 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:27:42 np0005540826 nova_compute[229148]: 2025-12-01 10:27:42.009 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:27:43 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:27:43 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:27:43 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:27:43.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:27:43 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:27:43 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:27:43 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:27:43.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:27:45 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:27:45 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:27:45 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:27:45.101 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:27:45 np0005540826 nova_compute[229148]: 2025-12-01 10:27:45.264 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:27:45 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:27:45 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:27:45 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:27:45.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:27:46 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:27:47 np0005540826 nova_compute[229148]: 2025-12-01 10:27:47.012 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:27:47 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:27:47 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:27:47 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:27:47.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:27:47 np0005540826 systemd-logind[787]: New session 55 of user zuul.
Dec  1 05:27:47 np0005540826 systemd[1]: Started Session 55 of User zuul.
Dec  1 05:27:47 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:27:47 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:27:47 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:27:47.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:27:49 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:27:49 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:27:49 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:27:49.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:27:49 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:27:49 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:27:49 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:27:49.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:27:50 np0005540826 nova_compute[229148]: 2025-12-01 10:27:50.265 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:27:51 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:27:51 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:27:51 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:27:51.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:27:51 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "status"} v 0)
Dec  1 05:27:51 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2639347518' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Dec  1 05:27:51 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:27:51 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:27:51 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:27:51.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:27:51 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:27:52 np0005540826 nova_compute[229148]: 2025-12-01 10:27:52.015 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:27:53 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:27:53 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:27:53 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:27:53.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:27:53 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:27:53 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:27:53 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:27:53.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:27:55 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:27:55 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:27:55 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:27:55.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:27:55 np0005540826 nova_compute[229148]: 2025-12-01 10:27:55.267 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:27:55 np0005540826 podman[244628]: 2025-12-01 10:27:55.569544931 +0000 UTC m=+0.073007384 container health_status 9ce59c2758d021c154e97f8a3178b73d8f43045c59b9ba6fd93369a760a49136 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Dec  1 05:27:55 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:27:55 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:27:55 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:27:55.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:27:56 np0005540826 ovs-vsctl[244676]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Dec  1 05:27:56 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:27:57 np0005540826 nova_compute[229148]: 2025-12-01 10:27:57.019 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:27:57 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:27:57 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:27:57 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:27:57.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:27:57 np0005540826 virtqemud[228647]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Dec  1 05:27:57 np0005540826 virtqemud[228647]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Dec  1 05:27:57 np0005540826 virtqemud[228647]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Dec  1 05:27:57 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:27:57 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:27:57 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:27:57.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:27:58 np0005540826 ceph-mds[84503]: mds.cephfs.compute-1.ijlzoi asok_command: cache status {prefix=cache status} (starting...)
Dec  1 05:27:58 np0005540826 ceph-mds[84503]: mds.cephfs.compute-1.ijlzoi Can't run that command on an inactive MDS!
Dec  1 05:27:58 np0005540826 ceph-mds[84503]: mds.cephfs.compute-1.ijlzoi asok_command: client ls {prefix=client ls} (starting...)
Dec  1 05:27:58 np0005540826 ceph-mds[84503]: mds.cephfs.compute-1.ijlzoi Can't run that command on an inactive MDS!
Dec  1 05:27:58 np0005540826 lvm[244994]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec  1 05:27:58 np0005540826 lvm[244994]: VG ceph_vg0 finished
Dec  1 05:27:58 np0005540826 nova_compute[229148]: 2025-12-01 10:27:58.475 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:27:58 np0005540826 ceph-mds[84503]: mds.cephfs.compute-1.ijlzoi asok_command: damage ls {prefix=damage ls} (starting...)
Dec  1 05:27:58 np0005540826 ceph-mds[84503]: mds.cephfs.compute-1.ijlzoi Can't run that command on an inactive MDS!
Dec  1 05:27:59 np0005540826 ceph-mds[84503]: mds.cephfs.compute-1.ijlzoi asok_command: dump loads {prefix=dump loads} (starting...)
Dec  1 05:27:59 np0005540826 ceph-mds[84503]: mds.cephfs.compute-1.ijlzoi Can't run that command on an inactive MDS!
Dec  1 05:27:59 np0005540826 nova_compute[229148]: 2025-12-01 10:27:59.110 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:27:59 np0005540826 nova_compute[229148]: 2025-12-01 10:27:59.110 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  1 05:27:59 np0005540826 nova_compute[229148]: 2025-12-01 10:27:59.110 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  1 05:27:59 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "report"} v 0)
Dec  1 05:27:59 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/183974291' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Dec  1 05:27:59 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:27:59 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:27:59 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:27:59.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:27:59 np0005540826 nova_compute[229148]: 2025-12-01 10:27:59.124 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  1 05:27:59 np0005540826 ceph-mds[84503]: mds.cephfs.compute-1.ijlzoi asok_command: dump tree {prefix=dump tree,root=/} (starting...)
Dec  1 05:27:59 np0005540826 ceph-mds[84503]: mds.cephfs.compute-1.ijlzoi Can't run that command on an inactive MDS!
Dec  1 05:27:59 np0005540826 ceph-mds[84503]: mds.cephfs.compute-1.ijlzoi asok_command: dump_blocked_ops {prefix=dump_blocked_ops} (starting...)
Dec  1 05:27:59 np0005540826 ceph-mds[84503]: mds.cephfs.compute-1.ijlzoi Can't run that command on an inactive MDS!
Dec  1 05:27:59 np0005540826 ceph-mds[84503]: mds.cephfs.compute-1.ijlzoi asok_command: dump_historic_ops {prefix=dump_historic_ops} (starting...)
Dec  1 05:27:59 np0005540826 ceph-mds[84503]: mds.cephfs.compute-1.ijlzoi Can't run that command on an inactive MDS!
Dec  1 05:27:59 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec  1 05:27:59 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3111091241' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec  1 05:27:59 np0005540826 ceph-mds[84503]: mds.cephfs.compute-1.ijlzoi asok_command: dump_historic_ops_by_duration {prefix=dump_historic_ops_by_duration} (starting...)
Dec  1 05:27:59 np0005540826 ceph-mds[84503]: mds.cephfs.compute-1.ijlzoi Can't run that command on an inactive MDS!
Dec  1 05:27:59 np0005540826 ceph-mds[84503]: mds.cephfs.compute-1.ijlzoi asok_command: dump_ops_in_flight {prefix=dump_ops_in_flight} (starting...)
Dec  1 05:27:59 np0005540826 ceph-mds[84503]: mds.cephfs.compute-1.ijlzoi Can't run that command on an inactive MDS!
Dec  1 05:27:59 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:27:59 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:27:59 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:27:59.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:27:59 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config log"} v 0)
Dec  1 05:27:59 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3584704780' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Dec  1 05:27:59 np0005540826 ceph-mds[84503]: mds.cephfs.compute-1.ijlzoi asok_command: get subtrees {prefix=get subtrees} (starting...)
Dec  1 05:27:59 np0005540826 ceph-mds[84503]: mds.cephfs.compute-1.ijlzoi Can't run that command on an inactive MDS!
Dec  1 05:28:00 np0005540826 ceph-mds[84503]: mds.cephfs.compute-1.ijlzoi asok_command: ops {prefix=ops} (starting...)
Dec  1 05:28:00 np0005540826 ceph-mds[84503]: mds.cephfs.compute-1.ijlzoi Can't run that command on an inactive MDS!
Dec  1 05:28:00 np0005540826 nova_compute[229148]: 2025-12-01 10:28:00.268 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:28:00 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config-key dump"} v 0)
Dec  1 05:28:00 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3608231201' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Dec  1 05:28:00 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm"} v 0)
Dec  1 05:28:00 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3149209817' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Dec  1 05:28:00 np0005540826 ceph-mds[84503]: mds.cephfs.compute-1.ijlzoi asok_command: session ls {prefix=session ls} (starting...)
Dec  1 05:28:00 np0005540826 ceph-mds[84503]: mds.cephfs.compute-1.ijlzoi Can't run that command on an inactive MDS!
Dec  1 05:28:00 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0)
Dec  1 05:28:00 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1179325203' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Dec  1 05:28:00 np0005540826 ceph-mds[84503]: mds.cephfs.compute-1.ijlzoi asok_command: status {prefix=status} (starting...)
Dec  1 05:28:01 np0005540826 nova_compute[229148]: 2025-12-01 10:28:01.119 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:28:01 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:28:01 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:28:01 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:28:01.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:28:01 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Dec  1 05:28:01 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2521562130' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Dec  1 05:28:01 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "features"} v 0)
Dec  1 05:28:01 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2807277753' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Dec  1 05:28:01 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:28:01 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:28:01 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:28:01.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:28:01 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:28:01 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Dec  1 05:28:01 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1042598284' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Dec  1 05:28:02 np0005540826 nova_compute[229148]: 2025-12-01 10:28:02.108 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:28:02 np0005540826 nova_compute[229148]: 2025-12-01 10:28:02.109 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:28:02 np0005540826 nova_compute[229148]: 2025-12-01 10:28:02.109 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:28:02 np0005540826 nova_compute[229148]: 2025-12-01 10:28:02.110 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  1 05:28:02 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail"} v 0)
Dec  1 05:28:02 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/298894126' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Dec  1 05:28:02 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0)
Dec  1 05:28:02 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2552589729' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Dec  1 05:28:02 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr stat"} v 0)
Dec  1 05:28:02 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4102015110' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Dec  1 05:28:03 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} v 0)
Dec  1 05:28:03 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1812810523' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Dec  1 05:28:03 np0005540826 nova_compute[229148]: 2025-12-01 10:28:03.109 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:28:03 np0005540826 nova_compute[229148]: 2025-12-01 10:28:03.109 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:28:03 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:28:03 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:28:03 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:28:03.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:28:03 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0)
Dec  1 05:28:03 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/696210361' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Dec  1 05:28:03 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} v 0)
Dec  1 05:28:03 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/415455245' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Dec  1 05:28:03 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:28:03 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:28:03 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:28:03.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:28:03 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0)
Dec  1 05:28:03 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4008813133' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Dec  1 05:28:03 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0)
Dec  1 05:28:03 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2753251066' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Dec  1 05:28:04 np0005540826 nova_compute[229148]: 2025-12-01 10:28:04.105 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:28:04 np0005540826 nova_compute[229148]: 2025-12-01 10:28:04.108 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:28:04 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Dec  1 05:28:04 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4130838568' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 78716928 unmapped: 933888 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 78741504 unmapped: 909312 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 78741504 unmapped: 909312 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 78749696 unmapped: 901120 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.093087196s of 10.114652634s, submitted: 8
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 78766080 unmapped: 884736 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 921397 data_alloc: 218103808 data_used: 151552
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 78766080 unmapped: 884736 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 78774272 unmapped: 876544 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 78815232 unmapped: 835584 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 78823424 unmapped: 827392 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 78848000 unmapped: 802816 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 922777 data_alloc: 218103808 data_used: 151552
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 78864384 unmapped: 786432 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 78864384 unmapped: 786432 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 78872576 unmapped: 778240 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 78872576 unmapped: 778240 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 78880768 unmapped: 770048 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 922761 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 78880768 unmapped: 770048 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.615036011s of 11.657299042s, submitted: 10
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 78880768 unmapped: 770048 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 78888960 unmapped: 761856 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 78888960 unmapped: 761856 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 78897152 unmapped: 753664 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 922629 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 78905344 unmapped: 745472 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 78905344 unmapped: 745472 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 78913536 unmapped: 737280 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 78913536 unmapped: 737280 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 78757888 unmapped: 892928 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 922629 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 78757888 unmapped: 892928 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 ms_handle_reset con 0x55e0b78b7800 session 0x55e0b4a8d2c0
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 78757888 unmapped: 892928 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 78766080 unmapped: 884736 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 78766080 unmapped: 884736 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 78774272 unmapped: 876544 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 922629 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 78774272 unmapped: 876544 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 78774272 unmapped: 876544 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 78782464 unmapped: 868352 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 78782464 unmapped: 868352 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 78790656 unmapped: 860160 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 922629 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 78790656 unmapped: 860160 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 78790656 unmapped: 860160 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 21.610450745s of 21.635799408s, submitted: 1
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 78798848 unmapped: 851968 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 78798848 unmapped: 851968 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 78798848 unmapped: 851968 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 924289 data_alloc: 218103808 data_used: 151552
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 819200 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 819200 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 819200 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [0,0,0,0,0,1])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 78839808 unmapped: 811008 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 78839808 unmapped: 811008 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 924289 data_alloc: 218103808 data_used: 151552
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 78839808 unmapped: 811008 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 78856192 unmapped: 794624 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 78856192 unmapped: 794624 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 78864384 unmapped: 786432 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.940896034s of 12.041529655s, submitted: 10
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [0,0,0,0,0,1])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 78872576 unmapped: 778240 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 923682 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 78872576 unmapped: 778240 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 78888960 unmapped: 761856 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 78888960 unmapped: 761856 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 78897152 unmapped: 753664 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 78897152 unmapped: 753664 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 923550 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 78897152 unmapped: 753664 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 78897152 unmapped: 753664 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 78905344 unmapped: 745472 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 78905344 unmapped: 745472 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 78913536 unmapped: 737280 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 923550 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 78913536 unmapped: 737280 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 78921728 unmapped: 729088 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 78921728 unmapped: 729088 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 78921728 unmapped: 729088 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 78929920 unmapped: 720896 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 923550 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 78929920 unmapped: 720896 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 78938112 unmapped: 712704 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 78938112 unmapped: 712704 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 78938112 unmapped: 712704 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 78946304 unmapped: 704512 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 923550 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 78946304 unmapped: 704512 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 78946304 unmapped: 704512 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 78954496 unmapped: 696320 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 78954496 unmapped: 696320 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 78962688 unmapped: 688128 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 923550 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 78962688 unmapped: 688128 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 78962688 unmapped: 688128 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 78970880 unmapped: 679936 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 78970880 unmapped: 679936 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 78979072 unmapped: 671744 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 923550 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 78979072 unmapped: 671744 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 78979072 unmapped: 671744 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 78987264 unmapped: 663552 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 78987264 unmapped: 663552 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 78995456 unmapped: 655360 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 923550 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 78995456 unmapped: 655360 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 78995456 unmapped: 655360 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79003648 unmapped: 647168 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79003648 unmapped: 647168 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79011840 unmapped: 638976 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 923550 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79011840 unmapped: 638976 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79011840 unmapped: 638976 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 ms_handle_reset con 0x55e0b5290c00 session 0x55e0b78b3860
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79020032 unmapped: 630784 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79020032 unmapped: 630784 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79028224 unmapped: 622592 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 923550 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79028224 unmapped: 622592 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79028224 unmapped: 622592 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79036416 unmapped: 614400 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79036416 unmapped: 614400 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79036416 unmapped: 614400 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 923550 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79044608 unmapped: 606208 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79044608 unmapped: 606208 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 53.467578888s of 53.491367340s, submitted: 2
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79052800 unmapped: 598016 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79052800 unmapped: 598016 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79052800 unmapped: 598016 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 925210 data_alloc: 218103808 data_used: 151552
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79077376 unmapped: 573440 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79077376 unmapped: 573440 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79077376 unmapped: 573440 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79093760 unmapped: 557056 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79093760 unmapped: 557056 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 926722 data_alloc: 218103808 data_used: 151552
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79093760 unmapped: 557056 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79101952 unmapped: 548864 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79101952 unmapped: 548864 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 540672 heap: 79650816 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.005735397s of 12.085139275s, submitted: 11
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 80158720 unmapped: 540672 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 926115 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79126528 unmapped: 1572864 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79134720 unmapped: 1564672 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79134720 unmapped: 1564672 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79142912 unmapped: 1556480 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79142912 unmapped: 1556480 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 ms_handle_reset con 0x55e0b711d000 session 0x55e0b5230000
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 925983 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79142912 unmapped: 1556480 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79151104 unmapped: 1548288 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79159296 unmapped: 1540096 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79159296 unmapped: 1540096 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79167488 unmapped: 1531904 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 925983 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79167488 unmapped: 1531904 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79175680 unmapped: 1523712 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79175680 unmapped: 1523712 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79183872 unmapped: 1515520 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79183872 unmapped: 1515520 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 925983 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79183872 unmapped: 1515520 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 16.151161194s of 16.393291473s, submitted: 2
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79208448 unmapped: 1490944 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79208448 unmapped: 1490944 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79208448 unmapped: 1490944 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79216640 unmapped: 1482752 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 926131 data_alloc: 218103808 data_used: 151552
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79216640 unmapped: 1482752 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79224832 unmapped: 1474560 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79233024 unmapped: 1466368 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79233024 unmapped: 1466368 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79241216 unmapped: 1458176 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 925372 data_alloc: 218103808 data_used: 151552
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79257600 unmapped: 1441792 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79257600 unmapped: 1441792 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79265792 unmapped: 1433600 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.006743431s of 12.036339760s, submitted: 10
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79265792 unmapped: 1433600 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79273984 unmapped: 1425408 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 924933 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79273984 unmapped: 1425408 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79273984 unmapped: 1425408 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79282176 unmapped: 1417216 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79282176 unmapped: 1417216 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79290368 unmapped: 1409024 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 924801 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79290368 unmapped: 1409024 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79298560 unmapped: 1400832 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79298560 unmapped: 1400832 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79298560 unmapped: 1400832 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79298560 unmapped: 1400832 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 924801 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79306752 unmapped: 1392640 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79306752 unmapped: 1392640 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79314944 unmapped: 1384448 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 600.0 total, 600.0 interval
Cumulative writes: 7126 writes, 30K keys, 7126 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.03 MB/s
Cumulative WAL: 7126 writes, 1175 syncs, 6.06 writes per sync, written: 0.02 GB, 0.03 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 7126 writes, 30K keys, 7126 commit groups, 1.0 writes per commit group, ingest: 20.44 MB, 0.03 MB/s
Interval WAL: 7126 writes, 1175 syncs, 6.06 writes per sync, written: 0.02 GB, 0.03 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent

** Compaction Stats [default] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [default] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 600.0 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x55e0b36fd350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.5e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [default] **

** Compaction Stats [m-0] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-0] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 600.0 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x55e0b36fd350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.5e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [m-0] **

** Compaction Stats [m-1] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-1] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 600.0 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79372288 unmapped: 1327104 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79372288 unmapped: 1327104 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 924801 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79380480 unmapped: 1318912 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79380480 unmapped: 1318912 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79388672 unmapped: 1310720 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79388672 unmapped: 1310720 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79388672 unmapped: 1310720 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 924801 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79396864 unmapped: 1302528 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79396864 unmapped: 1302528 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79405056 unmapped: 1294336 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79405056 unmapped: 1294336 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79413248 unmapped: 1286144 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 924801 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79413248 unmapped: 1286144 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79413248 unmapped: 1286144 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79421440 unmapped: 1277952 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79421440 unmapped: 1277952 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79429632 unmapped: 1269760 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 924801 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79429632 unmapped: 1269760 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79429632 unmapped: 1269760 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79446016 unmapped: 1253376 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79446016 unmapped: 1253376 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79446016 unmapped: 1253376 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 924801 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79454208 unmapped: 1245184 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79454208 unmapped: 1245184 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79462400 unmapped: 1236992 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79462400 unmapped: 1236992 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79462400 unmapped: 1236992 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 924801 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79470592 unmapped: 1228800 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79470592 unmapped: 1228800 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79486976 unmapped: 1212416 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79495168 unmapped: 1204224 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79495168 unmapped: 1204224 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 924801 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79503360 unmapped: 1196032 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79503360 unmapped: 1196032 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79503360 unmapped: 1196032 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79511552 unmapped: 1187840 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79511552 unmapped: 1187840 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 924801 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79519744 unmapped: 1179648 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79519744 unmapped: 1179648 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79527936 unmapped: 1171456 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79527936 unmapped: 1171456 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79536128 unmapped: 1163264 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 924801 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79544320 unmapped: 1155072 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 ms_handle_reset con 0x55e0b47d0c00 session 0x55e0b7941a40
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79544320 unmapped: 1155072 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79544320 unmapped: 1155072 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79552512 unmapped: 1146880 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79552512 unmapped: 1146880 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 924801 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79560704 unmapped: 1138688 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79560704 unmapped: 1138688 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79560704 unmapped: 1138688 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79568896 unmapped: 1130496 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79568896 unmapped: 1130496 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 924801 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79577088 unmapped: 1122304 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79577088 unmapped: 1122304 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 69.855529785s of 69.865516663s, submitted: 2
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79601664 unmapped: 1097728 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79601664 unmapped: 1097728 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79609856 unmapped: 1089536 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 924949 data_alloc: 218103808 data_used: 151552
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79609856 unmapped: 1089536 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79609856 unmapped: 1089536 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79609856 unmapped: 1089536 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79618048 unmapped: 1081344 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79626240 unmapped: 1073152 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 924949 data_alloc: 218103808 data_used: 151552
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79634432 unmapped: 1064960 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79486976 unmapped: 1212416 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79486976 unmapped: 1212416 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79495168 unmapped: 1204224 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79495168 unmapped: 1204224 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 924949 data_alloc: 218103808 data_used: 151552
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79495168 unmapped: 1204224 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79503360 unmapped: 1196032 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 14.515069008s of 14.617650032s, submitted: 9
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79511552 unmapped: 1187840 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79519744 unmapped: 1179648 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79519744 unmapped: 1179648 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 924649 data_alloc: 218103808 data_used: 151552
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79519744 unmapped: 1179648 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79527936 unmapped: 1171456 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79527936 unmapped: 1171456 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79536128 unmapped: 1163264 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79536128 unmapped: 1163264 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 ms_handle_reset con 0x55e0b6a18400 session 0x55e0b7a03c20
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 924801 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79536128 unmapped: 1163264 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 ms_handle_reset con 0x55e0b6eee400 session 0x55e0b5252780
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79552512 unmapped: 1146880 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79552512 unmapped: 1146880 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79560704 unmapped: 1138688 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79560704 unmapped: 1138688 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 924801 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79560704 unmapped: 1138688 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79568896 unmapped: 1130496 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79568896 unmapped: 1130496 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79577088 unmapped: 1122304 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79577088 unmapped: 1122304 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 924801 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79585280 unmapped: 1114112 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 19.489625931s of 19.649112701s, submitted: 1
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79585280 unmapped: 1114112 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79585280 unmapped: 1114112 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79593472 unmapped: 1105920 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79609856 unmapped: 1089536 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 926593 data_alloc: 218103808 data_used: 151552
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79601664 unmapped: 1097728 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79601664 unmapped: 1097728 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79609856 unmapped: 1089536 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79642624 unmapped: 1056768 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79642624 unmapped: 1056768 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 926593 data_alloc: 218103808 data_used: 151552
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79650816 unmapped: 1048576 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79634432 unmapped: 1064960 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79642624 unmapped: 1056768 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79642624 unmapped: 1056768 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79642624 unmapped: 1056768 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 13.540737152s of 13.827812195s, submitted: 13
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 926445 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79650816 unmapped: 1048576 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79659008 unmapped: 1040384 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79667200 unmapped: 1032192 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79667200 unmapped: 1032192 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79667200 unmapped: 1032192 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 926313 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79675392 unmapped: 1024000 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79675392 unmapped: 1024000 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79675392 unmapped: 1024000 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79683584 unmapped: 1015808 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79699968 unmapped: 999424 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 926313 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79699968 unmapped: 999424 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 ms_handle_reset con 0x55e0b78b7c00 session 0x55e0b81f1e00
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79699968 unmapped: 999424 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79708160 unmapped: 991232 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79708160 unmapped: 991232 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79708160 unmapped: 991232 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 926313 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79716352 unmapped: 983040 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79716352 unmapped: 983040 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79716352 unmapped: 983040 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79724544 unmapped: 974848 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79732736 unmapped: 966656 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 926313 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79732736 unmapped: 966656 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79732736 unmapped: 966656 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 21.887556076s of 21.894308090s, submitted: 2
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79732736 unmapped: 966656 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79732736 unmapped: 966656 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79732736 unmapped: 966656 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 926461 data_alloc: 218103808 data_used: 151552
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79732736 unmapped: 966656 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79732736 unmapped: 966656 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79732736 unmapped: 966656 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79749120 unmapped: 950272 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79749120 unmapped: 950272 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 927973 data_alloc: 218103808 data_used: 151552
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79749120 unmapped: 950272 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79749120 unmapped: 950272 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79749120 unmapped: 950272 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79749120 unmapped: 950272 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79749120 unmapped: 950272 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 927366 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79749120 unmapped: 950272 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79749120 unmapped: 950272 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 14.272251129s of 14.991144180s, submitted: 11
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79757312 unmapped: 942080 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79757312 unmapped: 942080 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79757312 unmapped: 942080 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 927234 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79757312 unmapped: 942080 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79757312 unmapped: 942080 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79757312 unmapped: 942080 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79757312 unmapped: 942080 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79757312 unmapped: 942080 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 927234 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79757312 unmapped: 942080 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79757312 unmapped: 942080 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79757312 unmapped: 942080 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.911170959s of 11.030269623s, submitted: 1
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 79831040 unmapped: 868352 heap: 80699392 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [0,0,0,0,0,0,1,0,0,2])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 80871424 unmapped: 876544 heap: 81747968 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 927234 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81092608 unmapped: 1703936 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81215488 unmapped: 1581056 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81215488 unmapped: 1581056 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81215488 unmapped: 1581056 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81215488 unmapped: 1581056 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 927234 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81215488 unmapped: 1581056 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81223680 unmapped: 1572864 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81223680 unmapped: 1572864 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81223680 unmapped: 1572864 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81223680 unmapped: 1572864 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 927234 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 ms_handle_reset con 0x55e0b47d0c00 session 0x55e0b8248d20
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81223680 unmapped: 1572864 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81223680 unmapped: 1572864 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81223680 unmapped: 1572864 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81223680 unmapped: 1572864 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 ms_handle_reset con 0x55e0b6a18c00 session 0x55e0b5156960
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81223680 unmapped: 1572864 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 927234 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81223680 unmapped: 1572864 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81223680 unmapped: 1572864 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81223680 unmapped: 1572864 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81223680 unmapped: 1572864 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81223680 unmapped: 1572864 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 927234 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81223680 unmapped: 1572864 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 21.574874878s of 23.242883682s, submitted: 207
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81256448 unmapped: 1540096 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81256448 unmapped: 1540096 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81256448 unmapped: 1540096 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81272832 unmapped: 1523712 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 927514 data_alloc: 218103808 data_used: 151552
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81289216 unmapped: 1507328 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81289216 unmapped: 1507328 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81297408 unmapped: 1499136 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81297408 unmapped: 1499136 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81297408 unmapped: 1499136 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 927514 data_alloc: 218103808 data_used: 151552
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81305600 unmapped: 1490944 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81305600 unmapped: 1490944 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81305600 unmapped: 1490944 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.755687714s of 12.057899475s, submitted: 10
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81371136 unmapped: 1425408 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81371136 unmapped: 1425408 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 928435 data_alloc: 218103808 data_used: 151552
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81371136 unmapped: 1425408 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81420288 unmapped: 1376256 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81420288 unmapped: 1376256 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81420288 unmapped: 1376256 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81436672 unmapped: 1359872 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 928155 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81436672 unmapped: 1359872 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81436672 unmapped: 1359872 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81436672 unmapped: 1359872 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81436672 unmapped: 1359872 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81436672 unmapped: 1359872 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 928155 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81436672 unmapped: 1359872 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81436672 unmapped: 1359872 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 ms_handle_reset con 0x55e0b6a18400 session 0x55e0b7af3680
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81436672 unmapped: 1359872 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81436672 unmapped: 1359872 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81436672 unmapped: 1359872 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 928155 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81436672 unmapped: 1359872 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81436672 unmapped: 1359872 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81436672 unmapped: 1359872 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81436672 unmapped: 1359872 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81436672 unmapped: 1359872 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 928155 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81436672 unmapped: 1359872 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81436672 unmapped: 1359872 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 24.651508331s of 24.699497223s, submitted: 11
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81436672 unmapped: 1359872 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81436672 unmapped: 1359872 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81436672 unmapped: 1359872 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 928287 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81436672 unmapped: 1359872 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81436672 unmapped: 1359872 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81436672 unmapped: 1359872 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81444864 unmapped: 1351680 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81444864 unmapped: 1351680 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931327 data_alloc: 218103808 data_used: 151552
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81461248 unmapped: 1335296 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81469440 unmapped: 1327104 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81469440 unmapped: 1327104 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81469440 unmapped: 1327104 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81485824 unmapped: 1310720 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930720 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81485824 unmapped: 1310720 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 13.107837677s of 13.236147881s, submitted: 12
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81502208 unmapped: 1294336 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81502208 unmapped: 1294336 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81502208 unmapped: 1294336 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81502208 unmapped: 1294336 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930588 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81502208 unmapped: 1294336 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81502208 unmapped: 1294336 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81502208 unmapped: 1294336 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81510400 unmapped: 1286144 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81510400 unmapped: 1286144 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930588 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81510400 unmapped: 1286144 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81510400 unmapped: 1286144 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81510400 unmapped: 1286144 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81510400 unmapped: 1286144 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81510400 unmapped: 1286144 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930588 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81510400 unmapped: 1286144 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81510400 unmapped: 1286144 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81510400 unmapped: 1286144 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81510400 unmapped: 1286144 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81510400 unmapped: 1286144 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930588 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81510400 unmapped: 1286144 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81510400 unmapped: 1286144 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81510400 unmapped: 1286144 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81510400 unmapped: 1286144 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81510400 unmapped: 1286144 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930588 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81510400 unmapped: 1286144 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81510400 unmapped: 1286144 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81510400 unmapped: 1286144 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 ms_handle_reset con 0x55e0b711d000 session 0x55e0b81f14a0
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81510400 unmapped: 1286144 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81510400 unmapped: 1286144 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930588 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81510400 unmapped: 1286144 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81510400 unmapped: 1286144 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81510400 unmapped: 1286144 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81510400 unmapped: 1286144 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81510400 unmapped: 1286144 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930588 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81526784 unmapped: 1269760 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81534976 unmapped: 1261568 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81534976 unmapped: 1261568 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81534976 unmapped: 1261568 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 38.510501862s of 38.514125824s, submitted: 1
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81534976 unmapped: 1261568 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930720 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81534976 unmapped: 1261568 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81534976 unmapped: 1261568 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81534976 unmapped: 1261568 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81534976 unmapped: 1261568 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81534976 unmapped: 1261568 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932248 data_alloc: 218103808 data_used: 151552
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 ms_handle_reset con 0x55e0b6eee000 session 0x55e0b7bfa960
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81534976 unmapped: 1261568 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81534976 unmapped: 1261568 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81534976 unmapped: 1261568 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81551360 unmapped: 1245184 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81551360 unmapped: 1245184 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931489 data_alloc: 218103808 data_used: 151552
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81551360 unmapped: 1245184 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.001915932s of 12.080320358s, submitted: 10
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81551360 unmapped: 1245184 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81575936 unmapped: 1220608 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81584128 unmapped: 1212416 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81584128 unmapped: 1212416 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930918 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81600512 unmapped: 1196032 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81616896 unmapped: 1179648 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81616896 unmapped: 1179648 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81616896 unmapped: 1179648 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81633280 unmapped: 1163264 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931066 data_alloc: 218103808 data_used: 151552
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81633280 unmapped: 1163264 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81633280 unmapped: 1163264 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.010660172s of 11.032484055s, submitted: 7
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81633280 unmapped: 1163264 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81633280 unmapped: 1163264 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81657856 unmapped: 1138688 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930307 data_alloc: 218103808 data_used: 151552
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81682432 unmapped: 1114112 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81682432 unmapped: 1114112 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81682432 unmapped: 1114112 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81682432 unmapped: 1114112 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81682432 unmapped: 1114112 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930459 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81682432 unmapped: 1114112 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81682432 unmapped: 1114112 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81682432 unmapped: 1114112 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81682432 unmapped: 1114112 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81682432 unmapped: 1114112 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930327 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81698816 unmapped: 1097728 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81698816 unmapped: 1097728 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81698816 unmapped: 1097728 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81698816 unmapped: 1097728 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81698816 unmapped: 1097728 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930327 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81698816 unmapped: 1097728 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81698816 unmapped: 1097728 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81698816 unmapped: 1097728 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81698816 unmapped: 1097728 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81698816 unmapped: 1097728 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930327 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81698816 unmapped: 1097728 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81698816 unmapped: 1097728 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81698816 unmapped: 1097728 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81698816 unmapped: 1097728 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81698816 unmapped: 1097728 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930327 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81698816 unmapped: 1097728 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81698816 unmapped: 1097728 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81698816 unmapped: 1097728 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81698816 unmapped: 1097728 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81698816 unmapped: 1097728 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930327 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81715200 unmapped: 1081344 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81715200 unmapped: 1081344 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81715200 unmapped: 1081344 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81723392 unmapped: 1073152 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [1])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81723392 unmapped: 1073152 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930327 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81723392 unmapped: 1073152 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81723392 unmapped: 1073152 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81723392 unmapped: 1073152 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81723392 unmapped: 1073152 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81723392 unmapped: 1073152 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930327 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81723392 unmapped: 1073152 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81723392 unmapped: 1073152 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81723392 unmapped: 1073152 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81723392 unmapped: 1073152 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81723392 unmapped: 1073152 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930327 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81723392 unmapped: 1073152 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81723392 unmapped: 1073152 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81723392 unmapped: 1073152 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 ms_handle_reset con 0x55e0b78b6400 session 0x55e0b7bfa3c0
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81723392 unmapped: 1073152 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81723392 unmapped: 1073152 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930327 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81739776 unmapped: 1056768 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81739776 unmapped: 1056768 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81739776 unmapped: 1056768 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81739776 unmapped: 1056768 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81739776 unmapped: 1056768 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930327 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81739776 unmapped: 1056768 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81739776 unmapped: 1056768 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81739776 unmapped: 1056768 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81739776 unmapped: 1056768 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 62.790245056s of 62.876361847s, submitted: 6
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81739776 unmapped: 1056768 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930459 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81739776 unmapped: 1056768 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81739776 unmapped: 1056768 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81756160 unmapped: 1040384 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81788928 unmapped: 1007616 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81788928 unmapped: 1007616 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930475 data_alloc: 218103808 data_used: 151552
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81788928 unmapped: 1007616 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81805312 unmapped: 991232 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81805312 unmapped: 991232 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81805312 unmapped: 991232 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81805312 unmapped: 991232 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931687 data_alloc: 218103808 data_used: 151552
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81805312 unmapped: 991232 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81805312 unmapped: 991232 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81805312 unmapped: 991232 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81805312 unmapped: 991232 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81805312 unmapped: 991232 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931839 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81805312 unmapped: 991232 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:28:04.562 141685 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:28:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:28:04.562 141685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:28:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:28:04.562 141685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81805312 unmapped: 991232 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81805312 unmapped: 991232 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81805312 unmapped: 991232 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81805312 unmapped: 991232 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931839 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81813504 unmapped: 983040 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81813504 unmapped: 983040 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81813504 unmapped: 983040 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81813504 unmapped: 983040 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81813504 unmapped: 983040 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931839 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81813504 unmapped: 983040 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81813504 unmapped: 983040 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81813504 unmapped: 983040 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81813504 unmapped: 983040 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81813504 unmapped: 983040 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931839 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81821696 unmapped: 974848 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81821696 unmapped: 974848 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81821696 unmapped: 974848 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81821696 unmapped: 974848 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81821696 unmapped: 974848 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931839 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81821696 unmapped: 974848 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81821696 unmapped: 974848 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81821696 unmapped: 974848 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81821696 unmapped: 974848 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81821696 unmapped: 974848 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931839 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81821696 unmapped: 974848 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81821696 unmapped: 974848 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81821696 unmapped: 974848 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81821696 unmapped: 974848 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81821696 unmapped: 974848 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931839 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81821696 unmapped: 974848 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81821696 unmapped: 974848 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81821696 unmapped: 974848 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81829888 unmapped: 966656 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81829888 unmapped: 966656 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 ms_handle_reset con 0x55e0b6eee400 session 0x55e0b7d2e780
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931839 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81829888 unmapped: 966656 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81829888 unmapped: 966656 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81829888 unmapped: 966656 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 ms_handle_reset con 0x55e0b6a18400 session 0x55e0b7d343c0
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81829888 unmapped: 966656 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81829888 unmapped: 966656 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931839 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81829888 unmapped: 966656 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81829888 unmapped: 966656 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81829888 unmapped: 966656 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81829888 unmapped: 966656 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81829888 unmapped: 966656 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931839 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81829888 unmapped: 966656 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 61.460285187s of 61.512180328s, submitted: 11
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81838080 unmapped: 958464 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81838080 unmapped: 958464 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81838080 unmapped: 958464 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81846272 unmapped: 950272 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 933631 data_alloc: 218103808 data_used: 151552
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81846272 unmapped: 950272 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81846272 unmapped: 950272 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81846272 unmapped: 950272 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81846272 unmapped: 950272 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81846272 unmapped: 950272 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 933631 data_alloc: 218103808 data_used: 151552
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81846272 unmapped: 950272 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81846272 unmapped: 950272 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81846272 unmapped: 950272 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.953406334s of 11.992015839s, submitted: 12
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81846272 unmapped: 950272 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81862656 unmapped: 933888 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 933331 data_alloc: 218103808 data_used: 151552
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81862656 unmapped: 933888 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81862656 unmapped: 933888 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81862656 unmapped: 933888 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81862656 unmapped: 933888 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81870848 unmapped: 925696 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932760 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81870848 unmapped: 925696 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81870848 unmapped: 925696 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81870848 unmapped: 925696 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81870848 unmapped: 925696 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 ms_handle_reset con 0x55e0b6a18c00 session 0x55e0b7d352c0
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 ms_handle_reset con 0x55e0b711d000 session 0x55e0b7bfe1e0
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81870848 unmapped: 925696 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932760 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81870848 unmapped: 925696 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81870848 unmapped: 925696 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81870848 unmapped: 925696 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81870848 unmapped: 925696 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81870848 unmapped: 925696 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932760 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81887232 unmapped: 909312 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81887232 unmapped: 909312 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81887232 unmapped: 909312 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 ms_handle_reset con 0x55e0b5290800 session 0x55e0b524e5a0
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81887232 unmapped: 909312 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 21.795988083s of 21.811857224s, submitted: 4
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81887232 unmapped: 909312 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 933024 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81887232 unmapped: 909312 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81887232 unmapped: 909312 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81895424 unmapped: 901120 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81895424 unmapped: 901120 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81895424 unmapped: 901120 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 934536 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81895424 unmapped: 901120 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81903616 unmapped: 892928 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81903616 unmapped: 892928 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81903616 unmapped: 892928 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81920000 unmapped: 876544 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935305 data_alloc: 218103808 data_used: 151552
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81920000 unmapped: 876544 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81928192 unmapped: 868352 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81928192 unmapped: 868352 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 13.926730156s of 14.144864082s, submitted: 15
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81960960 unmapped: 835584 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 ms_handle_reset con 0x55e0b47d0c00 session 0x55e0b7d2e000
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81977344 unmapped: 819200 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935325 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81977344 unmapped: 819200 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81977344 unmapped: 819200 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81977344 unmapped: 819200 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81977344 unmapped: 819200 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81977344 unmapped: 819200 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935193 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81977344 unmapped: 819200 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81977344 unmapped: 819200 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81977344 unmapped: 819200 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81977344 unmapped: 819200 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.862897873s of 11.870977402s, submitted: 2
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82001920 unmapped: 794624 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935325 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82001920 unmapped: 794624 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82001920 unmapped: 794624 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82001920 unmapped: 794624 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82010112 unmapped: 786432 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82010112 unmapped: 786432 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935341 data_alloc: 218103808 data_used: 151552
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82010112 unmapped: 786432 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82026496 unmapped: 770048 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82026496 unmapped: 770048 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82026496 unmapped: 770048 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82026496 unmapped: 770048 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 934582 data_alloc: 218103808 data_used: 151552
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82026496 unmapped: 770048 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.008006096s of 12.044129372s, submitted: 11
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82026496 unmapped: 770048 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82026496 unmapped: 770048 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82026496 unmapped: 770048 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 934011 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 934011 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 934011 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 934011 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 934011 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 934011 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 934011 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 934011 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 934011 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 934011 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82083840 unmapped: 712704 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82083840 unmapped: 712704 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 ms_handle_reset con 0x55e0b6a18c00 session 0x55e0b53a45a0
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82083840 unmapped: 712704 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82083840 unmapped: 712704 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 934011 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82083840 unmapped: 712704 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82083840 unmapped: 712704 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82083840 unmapped: 712704 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82083840 unmapped: 712704 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82083840 unmapped: 712704 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 934011 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82083840 unmapped: 712704 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82083840 unmapped: 712704 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82083840 unmapped: 712704 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 62.043544769s of 62.273132324s, submitted: 2
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82083840 unmapped: 712704 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 ms_handle_reset con 0x55e0b78b6400 session 0x55e0b7f1af00
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82083840 unmapped: 712704 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 934143 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82083840 unmapped: 712704 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82083840 unmapped: 712704 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82083840 unmapped: 712704 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82083840 unmapped: 712704 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82083840 unmapped: 712704 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 934159 data_alloc: 218103808 data_used: 151552
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82083840 unmapped: 712704 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82083840 unmapped: 712704 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82083840 unmapped: 712704 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82083840 unmapped: 712704 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.962292671s of 10.977412224s, submitted: 4
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82083840 unmapped: 712704 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 934291 data_alloc: 218103808 data_used: 151552
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82083840 unmapped: 712704 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82092032 unmapped: 704512 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82092032 unmapped: 704512 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 83156992 unmapped: 688128 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 83156992 unmapped: 688128 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 934159 data_alloc: 218103808 data_used: 151552
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82132992 unmapped: 1712128 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82132992 unmapped: 1712128 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82132992 unmapped: 1712128 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82132992 unmapped: 1712128 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82132992 unmapped: 1712128 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 933991 data_alloc: 218103808 data_used: 151552
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82132992 unmapped: 1712128 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82157568 unmapped: 1687552 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82157568 unmapped: 1687552 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82157568 unmapped: 1687552 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 14.645601273s of 14.703873634s, submitted: 13
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82182144 unmapped: 1662976 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 934011 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82182144 unmapped: 1662976 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82182144 unmapped: 1662976 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82182144 unmapped: 1662976 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82182144 unmapped: 1662976 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82182144 unmapped: 1662976 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 934011 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82182144 unmapped: 1662976 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82182144 unmapped: 1662976 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 ms_handle_reset con 0x55e0b78b6000 session 0x55e0b5e52b40
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82182144 unmapped: 1662976 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82182144 unmapped: 1662976 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82182144 unmapped: 1662976 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 934011 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82182144 unmapped: 1662976 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82182144 unmapped: 1662976 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82182144 unmapped: 1662976 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82182144 unmapped: 1662976 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 ms_handle_reset con 0x55e0b78b7800 session 0x55e0b524e5a0
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82182144 unmapped: 1662976 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 934011 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82182144 unmapped: 1662976 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82182144 unmapped: 1662976 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 18.513740540s of 18.526220322s, submitted: 1
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82182144 unmapped: 1662976 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82182144 unmapped: 1662976 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82182144 unmapped: 1662976 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 934143 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82182144 unmapped: 1662976 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82182144 unmapped: 1662976 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82182144 unmapped: 1662976 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82182144 unmapped: 1662976 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82182144 unmapped: 1662976 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935803 data_alloc: 218103808 data_used: 151552
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82182144 unmapped: 1662976 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82182144 unmapped: 1662976 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82182144 unmapped: 1662976 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82198528 unmapped: 1646592 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 1200.0 total, 600.0 interval
Cumulative writes: 7887 writes, 31K keys, 7887 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.02 MB/s
Cumulative WAL: 7887 writes, 1549 syncs, 5.09 writes per sync, written: 0.02 GB, 0.02 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 761 writes, 1336 keys, 761 commit groups, 1.0 writes per commit group, ingest: 0.56 MB, 0.00 MB/s
Interval WAL: 761 writes, 374 syncs, 2.03 writes per sync, written: 0.00 GB, 0.00 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent

** Compaction Stats [default] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [default] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1200.0 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x55e0b36fd350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.5e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [default] **

** Compaction Stats [m-0] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-0] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1200.0 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x55e0b36fd350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.5e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [m-0] **

** Compaction Stats [m-1] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-1] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1200.0 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82231296 unmapped: 1613824 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935803 data_alloc: 218103808 data_used: 151552
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.910443306s of 12.943789482s, submitted: 11
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82255872 unmapped: 1589248 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82255872 unmapped: 1589248 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82255872 unmapped: 1589248 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82280448 unmapped: 1564672 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82280448 unmapped: 1564672 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 937167 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82280448 unmapped: 1564672 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82280448 unmapped: 1564672 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82280448 unmapped: 1564672 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82280448 unmapped: 1564672 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82280448 unmapped: 1564672 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 936444 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82280448 unmapped: 1564672 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82280448 unmapped: 1564672 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82280448 unmapped: 1564672 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82280448 unmapped: 1564672 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82280448 unmapped: 1564672 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 936444 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82280448 unmapped: 1564672 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82288640 unmapped: 1556480 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82288640 unmapped: 1556480 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82305024 unmapped: 1540096 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 ms_handle_reset con 0x55e0b6eee400 session 0x55e0b81f1860
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82305024 unmapped: 1540096 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 936444 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82305024 unmapped: 1540096 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82305024 unmapped: 1540096 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82305024 unmapped: 1540096 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82305024 unmapped: 1540096 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82305024 unmapped: 1540096 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 936444 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82305024 unmapped: 1540096 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82305024 unmapped: 1540096 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82305024 unmapped: 1540096 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82305024 unmapped: 1540096 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread fragmentation_score=0.000026 took=0.000049s
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82305024 unmapped: 1540096 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 936444 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 29.295097351s of 29.695419312s, submitted: 10
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82305024 unmapped: 1540096 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82305024 unmapped: 1540096 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82305024 unmapped: 1540096 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82313216 unmapped: 1531904 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82313216 unmapped: 1531904 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 938104 data_alloc: 218103808 data_used: 151552
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82313216 unmapped: 1531904 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82313216 unmapped: 1531904 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82313216 unmapped: 1531904 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82329600 unmapped: 1515520 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82337792 unmapped: 1507328 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 937345 data_alloc: 218103808 data_used: 151552
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82337792 unmapped: 1507328 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82337792 unmapped: 1507328 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82337792 unmapped: 1507328 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 13.566009521s of 13.605023384s, submitted: 11
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82337792 unmapped: 1507328 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82337792 unmapped: 1507328 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 937365 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82337792 unmapped: 1507328 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82337792 unmapped: 1507328 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82337792 unmapped: 1507328 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82337792 unmapped: 1507328 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82337792 unmapped: 1507328 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 937365 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82337792 unmapped: 1507328 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82337792 unmapped: 1507328 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82337792 unmapped: 1507328 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82337792 unmapped: 1507328 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82337792 unmapped: 1507328 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 937365 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82337792 unmapped: 1507328 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82337792 unmapped: 1507328 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82337792 unmapped: 1507328 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82354176 unmapped: 1490944 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82354176 unmapped: 1490944 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 937365 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82354176 unmapped: 1490944 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82354176 unmapped: 1490944 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82354176 unmapped: 1490944 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82354176 unmapped: 1490944 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82354176 unmapped: 1490944 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 937365 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82354176 unmapped: 1490944 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82354176 unmapped: 1490944 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82354176 unmapped: 1490944 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82354176 unmapped: 1490944 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82354176 unmapped: 1490944 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 937365 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82354176 unmapped: 1490944 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82354176 unmapped: 1490944 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82354176 unmapped: 1490944 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 ms_handle_reset con 0x55e0b47d0c00 session 0x55e0b8283680
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82354176 unmapped: 1490944 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82354176 unmapped: 1490944 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 937365 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82354176 unmapped: 1490944 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82362368 unmapped: 1482752 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82362368 unmapped: 1482752 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82378752 unmapped: 1466368 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82378752 unmapped: 1466368 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 937365 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82378752 unmapped: 1466368 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82378752 unmapped: 1466368 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82378752 unmapped: 1466368 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82378752 unmapped: 1466368 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 40.388652802s of 40.391891479s, submitted: 1
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82395136 unmapped: 1449984 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 937497 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82395136 unmapped: 1449984 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82395136 unmapped: 1449984 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82411520 unmapped: 1433600 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82411520 unmapped: 1433600 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82411520 unmapped: 1433600 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 937513 data_alloc: 218103808 data_used: 151552
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82444288 unmapped: 1400832 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82460672 unmapped: 1384448 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82460672 unmapped: 1384448 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 936754 data_alloc: 218103808 data_used: 151552
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 13.199851990s of 13.357616425s, submitted: 11
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82534400 unmapped: 1310720 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82550784 unmapped: 1294336 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82550784 unmapped: 1294336 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 936774 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 ms_handle_reset con 0x55e0b78b6000 session 0x55e0b7da9e00
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 ms_handle_reset con 0x55e0b6a18c00 session 0x55e0b7da94a0
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 936774 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 936774 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 14.696914673s of 14.699798584s, submitted: 1
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82575360 unmapped: 1269760 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82575360 unmapped: 1269760 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82575360 unmapped: 1269760 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 937038 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82452480 unmapped: 1392640 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82452480 unmapped: 1392640 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82452480 unmapped: 1392640 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82460672 unmapped: 1384448 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82460672 unmapped: 1384448 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 940078 data_alloc: 218103808 data_used: 151552
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82460672 unmapped: 1384448 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82460672 unmapped: 1384448 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82460672 unmapped: 1384448 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82460672 unmapped: 1384448 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.845451355s of 12.939780235s, submitted: 14
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82493440 unmapped: 1351680 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939930 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82493440 unmapped: 1351680 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82493440 unmapped: 1351680 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82493440 unmapped: 1351680 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82493440 unmapped: 1351680 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82501632 unmapped: 1343488 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939798 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82501632 unmapped: 1343488 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82501632 unmapped: 1343488 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82501632 unmapped: 1343488 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82501632 unmapped: 1343488 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82501632 unmapped: 1343488 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939798 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82501632 unmapped: 1343488 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82501632 unmapped: 1343488 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82501632 unmapped: 1343488 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82509824 unmapped: 1335296 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82509824 unmapped: 1335296 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939798 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82509824 unmapped: 1335296 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82509824 unmapped: 1335296 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82509824 unmapped: 1335296 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82509824 unmapped: 1335296 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82509824 unmapped: 1335296 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939798 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82509824 unmapped: 1335296 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82509824 unmapped: 1335296 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82509824 unmapped: 1335296 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82509824 unmapped: 1335296 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 24.911399841s of 24.919387817s, submitted: 2
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82509824 unmapped: 1335296 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939798 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82599936 unmapped: 1245184 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82698240 unmapped: 1146880 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82706432 unmapped: 1138688 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82706432 unmapped: 1138688 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82706432 unmapped: 1138688 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939798 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82706432 unmapped: 1138688 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82706432 unmapped: 1138688 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82706432 unmapped: 1138688 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82714624 unmapped: 1130496 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82714624 unmapped: 1130496 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939798 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82714624 unmapped: 1130496 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82714624 unmapped: 1130496 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82722816 unmapped: 1122304 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82722816 unmapped: 1122304 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82722816 unmapped: 1122304 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939798 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82722816 unmapped: 1122304 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82731008 unmapped: 1114112 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82731008 unmapped: 1114112 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82731008 unmapped: 1114112 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82731008 unmapped: 1114112 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939798 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 ms_handle_reset con 0x55e0b78fd800 session 0x55e0b7da8960
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82731008 unmapped: 1114112 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82731008 unmapped: 1114112 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82731008 unmapped: 1114112 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82739200 unmapped: 1105920 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82747392 unmapped: 1097728 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939798 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82747392 unmapped: 1097728 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82747392 unmapped: 1097728 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 ms_handle_reset con 0x55e0b78fdc00 session 0x55e0b4d35e00
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82747392 unmapped: 1097728 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82747392 unmapped: 1097728 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82747392 unmapped: 1097728 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939798 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82747392 unmapped: 1097728 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 31.330074310s of 32.056125641s, submitted: 220
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82755584 unmapped: 1089536 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82755584 unmapped: 1089536 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82763776 unmapped: 1081344 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82763776 unmapped: 1081344 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939946 data_alloc: 218103808 data_used: 151552
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82763776 unmapped: 1081344 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82763776 unmapped: 1081344 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82763776 unmapped: 1081344 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82804736 unmapped: 1040384 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82804736 unmapped: 1040384 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 940078 data_alloc: 218103808 data_used: 151552
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82804736 unmapped: 1040384 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82804736 unmapped: 1040384 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82804736 unmapped: 1040384 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82804736 unmapped: 1040384 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.306317329s of 12.338130951s, submitted: 11
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82804736 unmapped: 1040384 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 938896 data_alloc: 218103808 data_used: 151552
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82804736 unmapped: 1040384 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82804736 unmapped: 1040384 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 938748 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 938616 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 938616 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 938616 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 938616 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 938616 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 938616 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 938616 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 ms_handle_reset con 0x55e0b47d0c00 session 0x55e0b7d354a0
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 938616 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 ms_handle_reset con 0x55e0b6a18c00 session 0x55e0b4a8d2c0
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 938616 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 55.401966095s of 55.432224274s, submitted: 9
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 938748 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82837504 unmapped: 1007616 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82804736 unmapped: 1040384 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82804736 unmapped: 1040384 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 940240 data_alloc: 218103808 data_used: 151552
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82804736 unmapped: 1040384 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 940408 data_alloc: 218103808 data_used: 151552
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.045467377s of 12.250482559s, submitted: 11
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 83877888 unmapped: 1015808 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 83877888 unmapped: 1015808 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 83877888 unmapped: 1015808 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 958464 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 940392 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 958464 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 83959808 unmapped: 933888 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 83976192 unmapped: 917504 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 83976192 unmapped: 917504 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 83976192 unmapped: 917504 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939537 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 83976192 unmapped: 917504 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 83976192 unmapped: 917504 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 ms_handle_reset con 0x55e0b78b6000 session 0x55e0b832a780
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 83992576 unmapped: 901120 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 83992576 unmapped: 901120 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 83992576 unmapped: 901120 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939537 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 83992576 unmapped: 901120 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 83992576 unmapped: 901120 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 83992576 unmapped: 901120 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 83992576 unmapped: 901120 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 83992576 unmapped: 901120 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939537 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 83992576 unmapped: 901120 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 83992576 unmapped: 901120 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 83992576 unmapped: 901120 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 22.239133835s of 22.275657654s, submitted: 10
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 83992576 unmapped: 901120 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 83992576 unmapped: 901120 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939669 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 83992576 unmapped: 901120 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84008960 unmapped: 884736 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84008960 unmapped: 884736 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84008960 unmapped: 884736 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84025344 unmapped: 868352 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939685 data_alloc: 218103808 data_used: 151552
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84025344 unmapped: 868352 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84025344 unmapped: 868352 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84058112 unmapped: 835584 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84058112 unmapped: 835584 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84058112 unmapped: 835584 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939685 data_alloc: 218103808 data_used: 151552
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84058112 unmapped: 835584 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84058112 unmapped: 835584 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84058112 unmapped: 835584 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 14.702522278s of 14.757088661s, submitted: 10
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84082688 unmapped: 811008 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84082688 unmapped: 811008 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939385 data_alloc: 218103808 data_used: 151552
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84082688 unmapped: 811008 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84082688 unmapped: 811008 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84082688 unmapped: 811008 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84082688 unmapped: 811008 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84082688 unmapped: 811008 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939537 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84082688 unmapped: 811008 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84082688 unmapped: 811008 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 ms_handle_reset con 0x55e0b78b6400 session 0x55e0b82b0d20
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84082688 unmapped: 811008 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84082688 unmapped: 811008 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84082688 unmapped: 811008 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939537 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84082688 unmapped: 811008 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84082688 unmapped: 811008 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84082688 unmapped: 811008 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84082688 unmapped: 811008 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84082688 unmapped: 811008 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939537 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84082688 unmapped: 811008 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84082688 unmapped: 811008 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84082688 unmapped: 811008 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 20.179632187s of 20.300897598s, submitted: 1
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84099072 unmapped: 794624 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84099072 unmapped: 794624 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939669 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84099072 unmapped: 794624 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84107264 unmapped: 786432 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84107264 unmapped: 786432 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84107264 unmapped: 786432 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84123648 unmapped: 770048 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941197 data_alloc: 218103808 data_used: 151552
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84123648 unmapped: 770048 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84123648 unmapped: 770048 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84131840 unmapped: 761856 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84131840 unmapped: 761856 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84131840 unmapped: 761856 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941197 data_alloc: 218103808 data_used: 151552
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84131840 unmapped: 761856 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84131840 unmapped: 761856 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84131840 unmapped: 761856 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 14.630328178s of 14.677761078s, submitted: 11
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84140032 unmapped: 753664 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84140032 unmapped: 753664 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 940897 data_alloc: 218103808 data_used: 151552
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84140032 unmapped: 753664 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84140032 unmapped: 753664 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84140032 unmapped: 753664 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 ms_handle_reset con 0x55e0b6eee400 session 0x55e0b82db0e0
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84140032 unmapped: 753664 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84140032 unmapped: 753664 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941049 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84140032 unmapped: 753664 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84140032 unmapped: 753664 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84140032 unmapped: 753664 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84140032 unmapped: 753664 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84140032 unmapped: 753664 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941049 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84140032 unmapped: 753664 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84140032 unmapped: 753664 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84140032 unmapped: 753664 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 15.983519554s of 15.986623764s, submitted: 1
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84140032 unmapped: 753664 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84140032 unmapped: 753664 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941181 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84140032 unmapped: 753664 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 136 handle_osd_map epochs [137,137], i have 136, src has [1,137]
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84140032 unmapped: 753664 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84140032 unmapped: 753664 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 137 handle_osd_map epochs [138,138], i have 137, src has [1,138]
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 138 ms_handle_reset con 0x55e0b78b6400 session 0x55e0b7f1be00
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84246528 unmapped: 17432576 heap: 101679104 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 138 ms_handle_reset con 0x55e0b6a18c00 session 0x55e0b82a7680
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 138 handle_osd_map epochs [139,139], i have 138, src has [1,139]
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84320256 unmapped: 25755648 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1061229 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84320256 unmapped: 25755648 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 139 handle_osd_map epochs [139,140], i have 139, src has [1,140]
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 140 ms_handle_reset con 0x55e0b78b6000 session 0x55e0b7d7da40
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 140 heartbeat osd_stat(store_statfs(0x4fb669000/0x0/0x4ffc00000, data 0x10ee025/0x11a0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84369408 unmapped: 25706496 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84369408 unmapped: 25706496 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84369408 unmapped: 25706496 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84369408 unmapped: 25706496 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1064251 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84369408 unmapped: 25706496 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 140 heartbeat osd_stat(store_statfs(0x4fb667000/0x0/0x4ffc00000, data 0x10f01a6/0x11a4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 140 heartbeat osd_stat(store_statfs(0x4fb667000/0x0/0x4ffc00000, data 0x10f01a6/0x11a4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 140 handle_osd_map epochs [141,141], i have 140, src has [1,141]
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.795370102s of 12.924592018s, submitted: 30
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84385792 unmapped: 25690112 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84385792 unmapped: 25690112 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84385792 unmapped: 25690112 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84393984 unmapped: 25681920 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1066185 data_alloc: 218103808 data_used: 151552
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84393984 unmapped: 25681920 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84393984 unmapped: 25681920 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 141 heartbeat osd_stat(store_statfs(0x4fb665000/0x0/0x4ffc00000, data 0x10f21ce/0x11a7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84393984 unmapped: 25681920 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84393984 unmapped: 25681920 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84393984 unmapped: 25681920 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1066185 data_alloc: 218103808 data_used: 151552
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84393984 unmapped: 25681920 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84393984 unmapped: 25681920 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84402176 unmapped: 25673728 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 141 heartbeat osd_stat(store_statfs(0x4fb665000/0x0/0x4ffc00000, data 0x10f21ce/0x11a7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.582818031s of 11.626168251s, submitted: 21
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84410368 unmapped: 25665536 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84410368 unmapped: 25665536 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1066169 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84410368 unmapped: 25665536 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84418560 unmapped: 25657344 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 141 heartbeat osd_stat(store_statfs(0x4fb665000/0x0/0x4ffc00000, data 0x10f21ce/0x11a7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84418560 unmapped: 25657344 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84426752 unmapped: 25649152 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84434944 unmapped: 25640960 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1065446 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84434944 unmapped: 25640960 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 141 ms_handle_reset con 0x55e0b78fdc00 session 0x55e0b7d443c0
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 141 heartbeat osd_stat(store_statfs(0x4fb665000/0x0/0x4ffc00000, data 0x10f21ce/0x11a7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84434944 unmapped: 25640960 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84434944 unmapped: 25640960 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84434944 unmapped: 25640960 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84434944 unmapped: 25640960 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1065446 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84434944 unmapped: 25640960 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84434944 unmapped: 25640960 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 141 heartbeat osd_stat(store_statfs(0x4fb665000/0x0/0x4ffc00000, data 0x10f21ce/0x11a7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84434944 unmapped: 25640960 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84434944 unmapped: 25640960 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84434944 unmapped: 25640960 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1065446 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84434944 unmapped: 25640960 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 141 heartbeat osd_stat(store_statfs(0x4fb665000/0x0/0x4ffc00000, data 0x10f21ce/0x11a7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84434944 unmapped: 25640960 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 18.755990982s of 18.777509689s, submitted: 5
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 141 heartbeat osd_stat(store_statfs(0x4fb665000/0x0/0x4ffc00000, data 0x10f21ce/0x11a7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84434944 unmapped: 25640960 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 141 heartbeat osd_stat(store_statfs(0x4fb665000/0x0/0x4ffc00000, data 0x10f21ce/0x11a7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84434944 unmapped: 25640960 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84434944 unmapped: 25640960 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1065578 data_alloc: 218103808 data_used: 155648
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 141 ms_handle_reset con 0x55e0b6a18c00 session 0x55e0b7bb7680
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 91930624 unmapped: 18145280 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 141 ms_handle_reset con 0x55e0b6eee400 session 0x55e0b5254d20
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 91930624 unmapped: 18145280 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 91930624 unmapped: 18145280 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 141 handle_osd_map epochs [142,142], i have 141, src has [1,142]
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 142 handle_osd_map epochs [142,143], i have 142, src has [1,143]
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 143 heartbeat osd_stat(store_statfs(0x4fb661000/0x0/0x4ffc00000, data 0x10f4310/0x11aa000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [0,0,0,0,0,0,0,0,3])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 17489920 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 143 ms_handle_reset con 0x55e0b78b6000 session 0x55e0b82a6960
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 143 ms_handle_reset con 0x55e0b78b6400 session 0x55e0b82b1a40
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 92069888 unmapped: 18006016 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1170125 data_alloc: 218103808 data_used: 6971392
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 92069888 unmapped: 18006016 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 92069888 unmapped: 18006016 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 143 ms_handle_reset con 0x55e0b78fd400 session 0x55e0b59094a0
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 143 heartbeat osd_stat(store_statfs(0x4fabdc000/0x0/0x4ffc00000, data 0x1b76508/0x1c2e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 92086272 unmapped: 17989632 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 92086272 unmapped: 17989632 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 143 ms_handle_reset con 0x55e0b78fd400 session 0x55e0b7bff0e0
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 143 heartbeat osd_stat(store_statfs(0x4fabdc000/0x0/0x4ffc00000, data 0x1b76508/0x1c2e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 92086272 unmapped: 17989632 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1170125 data_alloc: 218103808 data_used: 6971392
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 143 ms_handle_reset con 0x55e0b6a18c00 session 0x55e0b8230960
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.781499863s of 13.647070885s, submitted: 47
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 143 ms_handle_reset con 0x55e0b6eee400 session 0x55e0b8248960
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 92397568 unmapped: 17678336 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 92405760 unmapped: 17670144 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 143 handle_osd_map epochs [144,144], i have 143, src has [1,144]
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 92413952 unmapped: 17661952 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 101523456 unmapped: 8552448 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fabb5000/0x0/0x4ffc00000, data 0x1b9c540/0x1c56000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 101539840 unmapped: 8536064 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1251826 data_alloc: 234881024 data_used: 16932864
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 101539840 unmapped: 8536064 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 101539840 unmapped: 8536064 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 101629952 unmapped: 8445952 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 101629952 unmapped: 8445952 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 101629952 unmapped: 8445952 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1251978 data_alloc: 234881024 data_used: 16936960
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fabb5000/0x0/0x4ffc00000, data 0x1b9c540/0x1c56000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 101646336 unmapped: 8429568 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fabb5000/0x0/0x4ffc00000, data 0x1b9c540/0x1c56000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 101679104 unmapped: 8396800 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 101679104 unmapped: 8396800 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 13.211468697s of 13.280833244s, submitted: 18
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 101679104 unmapped: 8396800 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 103358464 unmapped: 6717440 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279464 data_alloc: 234881024 data_used: 17043456
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b78fd800 session 0x55e0b830be00
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 103358464 unmapped: 6717440 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104243200 unmapped: 5832704 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa839000/0x0/0x4ffc00000, data 0x1f19540/0x1fd3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104243200 unmapped: 5832704 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104243200 unmapped: 5832704 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104251392 unmapped: 5824512 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280690 data_alloc: 234881024 data_used: 17043456
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104251392 unmapped: 5824512 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104251392 unmapped: 5824512 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104251392 unmapped: 5824512 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa839000/0x0/0x4ffc00000, data 0x1f19540/0x1fd3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104251392 unmapped: 5824512 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa839000/0x0/0x4ffc00000, data 0x1f19540/0x1fd3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104251392 unmapped: 5824512 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280690 data_alloc: 234881024 data_used: 17043456
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104251392 unmapped: 5824512 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.150278091s of 12.817354202s, submitted: 32
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104267776 unmapped: 5808128 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104267776 unmapped: 5808128 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104267776 unmapped: 5808128 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa839000/0x0/0x4ffc00000, data 0x1f19540/0x1fd3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104284160 unmapped: 5791744 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280278 data_alloc: 234881024 data_used: 17043456
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104284160 unmapped: 5791744 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104284160 unmapped: 5791744 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104284160 unmapped: 5791744 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa839000/0x0/0x4ffc00000, data 0x1f19540/0x1fd3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104284160 unmapped: 5791744 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104284160 unmapped: 5791744 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280278 data_alloc: 234881024 data_used: 17043456
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa839000/0x0/0x4ffc00000, data 0x1f19540/0x1fd3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104284160 unmapped: 5791744 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b47d0c00 session 0x55e0b82a7c20
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104284160 unmapped: 5791744 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa839000/0x0/0x4ffc00000, data 0x1f19540/0x1fd3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104284160 unmapped: 5791744 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104284160 unmapped: 5791744 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104284160 unmapped: 5791744 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280278 data_alloc: 234881024 data_used: 17043456
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6a18c00 session 0x55e0b82301e0
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6eee400 session 0x55e0b82d8000
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b78fd400 session 0x55e0b82483c0
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104292352 unmapped: 5783552 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b78fd800 session 0x55e0b7d7c000
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b78fc800 session 0x55e0b82b14a0
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 105472000 unmapped: 4603904 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa839000/0x0/0x4ffc00000, data 0x1f19540/0x1fd3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6a18c00 session 0x55e0b82d9c20
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 15.499177933s of 15.732673645s, submitted: 10
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6eee400 session 0x55e0b4d35e00
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b78fc800 session 0x55e0b823a1e0
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b78fd400 session 0x55e0b859d680
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b78fd800 session 0x55e0b823be00
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6a18c00 session 0x55e0b81f14a0
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104882176 unmapped: 6815744 heap: 111697920 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104882176 unmapped: 6815744 heap: 111697920 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104882176 unmapped: 6815744 heap: 111697920 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1302476 data_alloc: 234881024 data_used: 17567744
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa60e000/0x0/0x4ffc00000, data 0x2143550/0x21fe000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104882176 unmapped: 6815744 heap: 111697920 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104931328 unmapped: 6766592 heap: 111697920 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa60e000/0x0/0x4ffc00000, data 0x2143550/0x21fe000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa60c000/0x0/0x4ffc00000, data 0x2144550/0x21ff000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104931328 unmapped: 6766592 heap: 111697920 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa60c000/0x0/0x4ffc00000, data 0x2144550/0x21ff000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104931328 unmapped: 6766592 heap: 111697920 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b78fc800 session 0x55e0b859d2c0
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104538112 unmapped: 7159808 heap: 111697920 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1305444 data_alloc: 234881024 data_used: 17567744
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104554496 unmapped: 7143424 heap: 111697920 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104554496 unmapped: 7143424 heap: 111697920 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.507975578s of 10.574006081s, submitted: 14
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104546304 unmapped: 7151616 heap: 111697920 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104546304 unmapped: 7151616 heap: 111697920 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa5e9000/0x0/0x4ffc00000, data 0x2168550/0x2223000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 105062400 unmapped: 6635520 heap: 111697920 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1313036 data_alloc: 234881024 data_used: 18292736
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 106315776 unmapped: 5382144 heap: 111697920 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 106315776 unmapped: 5382144 heap: 111697920 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa5e9000/0x0/0x4ffc00000, data 0x2168550/0x2223000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa5e9000/0x0/0x4ffc00000, data 0x2168550/0x2223000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 106348544 unmapped: 5349376 heap: 111697920 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 106348544 unmapped: 5349376 heap: 111697920 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 106348544 unmapped: 5349376 heap: 111697920 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1322308 data_alloc: 234881024 data_used: 19689472
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 106348544 unmapped: 5349376 heap: 111697920 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 106348544 unmapped: 5349376 heap: 111697920 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 106348544 unmapped: 5349376 heap: 111697920 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.600978851s of 10.616786003s, submitted: 5
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa5e9000/0x0/0x4ffc00000, data 0x2168550/0x2223000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 106348544 unmapped: 5349376 heap: 111697920 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109969408 unmapped: 1728512 heap: 111697920 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1343706 data_alloc: 234881024 data_used: 19714048
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0.
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112353280 unmapped: 5341184 heap: 117694464 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112779264 unmapped: 4915200 heap: 117694464 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112779264 unmapped: 4915200 heap: 117694464 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f8c40000/0x0/0x4ffc00000, data 0x2969550/0x2a24000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112812032 unmapped: 4882432 heap: 117694464 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f8c40000/0x0/0x4ffc00000, data 0x2969550/0x2a24000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112812032 unmapped: 4882432 heap: 117694464 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1400442 data_alloc: 234881024 data_used: 20185088
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112812032 unmapped: 4882432 heap: 117694464 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112812032 unmapped: 4882432 heap: 117694464 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112812032 unmapped: 4882432 heap: 117694464 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112812032 unmapped: 4882432 heap: 117694464 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f8c40000/0x0/0x4ffc00000, data 0x2969550/0x2a24000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b78fd400 session 0x55e0b7d7d680
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b7c07000 session 0x55e0b7940f00
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112812032 unmapped: 4882432 heap: 117694464 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1400458 data_alloc: 234881024 data_used: 20185088
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.637441635s of 11.983428001s, submitted: 64
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110370816 unmapped: 7323648 heap: 117694464 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6eef800 session 0x55e0b7d2f2c0
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9698000/0x0/0x4ffc00000, data 0x1f1a540/0x1fd4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110379008 unmapped: 7315456 heap: 117694464 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9698000/0x0/0x4ffc00000, data 0x1f1a540/0x1fd4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110379008 unmapped: 7315456 heap: 117694464 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110379008 unmapped: 7315456 heap: 117694464 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110379008 unmapped: 7315456 heap: 117694464 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1290388 data_alloc: 234881024 data_used: 17440768
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110379008 unmapped: 7315456 heap: 117694464 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110379008 unmapped: 7315456 heap: 117694464 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110379008 unmapped: 7315456 heap: 117694464 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9698000/0x0/0x4ffc00000, data 0x1f1a540/0x1fd4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110379008 unmapped: 7315456 heap: 117694464 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b78fd000 session 0x55e0b7d454a0
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b78b6400 session 0x55e0b7d7d2c0
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 103915520 unmapped: 13778944 heap: 117694464 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1122142 data_alloc: 218103808 data_used: 7475200
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b85b9400 session 0x55e0b7a021e0
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 103874560 unmapped: 13819904 heap: 117694464 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 103874560 unmapped: 13819904 heap: 117694464 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 103874560 unmapped: 13819904 heap: 117694464 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa4bc000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 103874560 unmapped: 13819904 heap: 117694464 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 103874560 unmapped: 13819904 heap: 117694464 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1114345 data_alloc: 218103808 data_used: 7360512
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 103874560 unmapped: 13819904 heap: 117694464 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 103874560 unmapped: 13819904 heap: 117694464 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 103874560 unmapped: 13819904 heap: 117694464 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 103874560 unmapped: 13819904 heap: 117694464 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa4bc000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 103874560 unmapped: 13819904 heap: 117694464 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1114345 data_alloc: 218103808 data_used: 7360512
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 103874560 unmapped: 13819904 heap: 117694464 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa4bc000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 103874560 unmapped: 13819904 heap: 117694464 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 103874560 unmapped: 13819904 heap: 117694464 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 103874560 unmapped: 13819904 heap: 117694464 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 103874560 unmapped: 13819904 heap: 117694464 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1114345 data_alloc: 218103808 data_used: 7360512
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 103874560 unmapped: 13819904 heap: 117694464 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 103874560 unmapped: 13819904 heap: 117694464 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa4bc000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 103874560 unmapped: 13819904 heap: 117694464 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 103874560 unmapped: 13819904 heap: 117694464 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 103874560 unmapped: 13819904 heap: 117694464 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1114345 data_alloc: 218103808 data_used: 7360512
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 103874560 unmapped: 13819904 heap: 117694464 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 103874560 unmapped: 13819904 heap: 117694464 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa4bc000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 32.091907501s of 32.341941833s, submitted: 45
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b5291800 session 0x55e0b7d45a40
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b5290800 session 0x55e0b7bfe3c0
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b5290c00 session 0x55e0b7cc5a40
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b5291800 session 0x55e0b8231860
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b5290800 session 0x55e0b7f1a5a0
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 103653376 unmapped: 22511616 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6eee400 session 0x55e0b859c960
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 103653376 unmapped: 22511616 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 103653376 unmapped: 22511616 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1167432 data_alloc: 218103808 data_used: 7360512
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 103653376 unmapped: 22511616 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9dfe000/0x0/0x4ffc00000, data 0x17b64ce/0x186e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 103653376 unmapped: 22511616 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 103653376 unmapped: 22511616 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 103653376 unmapped: 22511616 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b78b6400 session 0x55e0b7d521e0
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 103653376 unmapped: 22511616 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1167432 data_alloc: 218103808 data_used: 7360512
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b5290800 session 0x55e0b7d7cd20
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b5290c00 session 0x55e0b523ba40
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 103636992 unmapped: 22528000 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b5291800 session 0x55e0b7d7c5a0
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9dfe000/0x0/0x4ffc00000, data 0x17b64ce/0x186e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 103636992 unmapped: 22528000 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 103636992 unmapped: 22528000 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 103702528 unmapped: 22462464 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 106070016 unmapped: 20094976 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1214228 data_alloc: 234881024 data_used: 14315520
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 106070016 unmapped: 20094976 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9dfe000/0x0/0x4ffc00000, data 0x17b64ce/0x186e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 106070016 unmapped: 20094976 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 106070016 unmapped: 20094976 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 106070016 unmapped: 20094976 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 106070016 unmapped: 20094976 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1214228 data_alloc: 234881024 data_used: 14315520
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 106070016 unmapped: 20094976 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 106070016 unmapped: 20094976 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9dfe000/0x0/0x4ffc00000, data 0x17b64ce/0x186e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b78fcc00 session 0x55e0b7d2e3c0
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 106070016 unmapped: 20094976 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b78fdc00 session 0x55e0b7d2e1e0
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 106070016 unmapped: 20094976 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 22.142061234s of 22.214815140s, submitted: 17
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110362624 unmapped: 15802368 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1255460 data_alloc: 234881024 data_used: 14860288
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110510080 unmapped: 15654912 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109092864 unmapped: 17072128 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f994e000/0x0/0x4ffc00000, data 0x1c5e4ce/0x1d16000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109092864 unmapped: 17072128 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109092864 unmapped: 17072128 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109092864 unmapped: 17072128 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1267452 data_alloc: 234881024 data_used: 15208448
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109092864 unmapped: 17072128 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109101056 unmapped: 17063936 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9956000/0x0/0x4ffc00000, data 0x1c5e4ce/0x1d16000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109101056 unmapped: 17063936 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9956000/0x0/0x4ffc00000, data 0x1c5e4ce/0x1d16000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109101056 unmapped: 17063936 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109101056 unmapped: 17063936 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1263052 data_alloc: 234881024 data_used: 15216640
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109101056 unmapped: 17063936 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9956000/0x0/0x4ffc00000, data 0x1c5e4ce/0x1d16000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109019136 unmapped: 17145856 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.508072853s of 12.713724136s, submitted: 69
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109019136 unmapped: 17145856 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109019136 unmapped: 17145856 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9956000/0x0/0x4ffc00000, data 0x1c5e4ce/0x1d16000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109051904 unmapped: 17113088 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1263068 data_alloc: 234881024 data_used: 15212544
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109051904 unmapped: 17113088 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109051904 unmapped: 17113088 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9956000/0x0/0x4ffc00000, data 0x1c5e4ce/0x1d16000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109084672 unmapped: 17080320 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109101056 unmapped: 17063936 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9956000/0x0/0x4ffc00000, data 0x1c5e4ce/0x1d16000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109101056 unmapped: 17063936 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1264284 data_alloc: 234881024 data_used: 15290368
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9956000/0x0/0x4ffc00000, data 0x1c5e4ce/0x1d16000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109117440 unmapped: 17047552 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9956000/0x0/0x4ffc00000, data 0x1c5e4ce/0x1d16000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109117440 unmapped: 17047552 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109117440 unmapped: 17047552 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.391874313s of 11.401729584s, submitted: 3
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6eee400 session 0x55e0b4a8c1e0
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104800256 unmapped: 21364736 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b5290c00 session 0x55e0b7bb7c20
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9956000/0x0/0x4ffc00000, data 0x1c5e4ce/0x1d16000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104800256 unmapped: 21364736 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1121954 data_alloc: 218103808 data_used: 7360512
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa4bc000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104800256 unmapped: 21364736 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104800256 unmapped: 21364736 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104800256 unmapped: 21364736 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa4bc000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104800256 unmapped: 21364736 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104800256 unmapped: 21364736 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1121954 data_alloc: 218103808 data_used: 7360512
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104800256 unmapped: 21364736 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa4bc000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104800256 unmapped: 21364736 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104800256 unmapped: 21364736 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104800256 unmapped: 21364736 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104800256 unmapped: 21364736 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1121954 data_alloc: 218103808 data_used: 7360512
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa4bc000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104800256 unmapped: 21364736 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104800256 unmapped: 21364736 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104800256 unmapped: 21364736 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa4bc000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104800256 unmapped: 21364736 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104800256 unmapped: 21364736 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1121954 data_alloc: 218103808 data_used: 7360512
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104800256 unmapped: 21364736 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104800256 unmapped: 21364736 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa4bc000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104800256 unmapped: 21364736 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104800256 unmapped: 21364736 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104808448 unmapped: 21356544 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1121954 data_alloc: 218103808 data_used: 7360512
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa4bc000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104808448 unmapped: 21356544 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104808448 unmapped: 21356544 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104808448 unmapped: 21356544 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104808448 unmapped: 21356544 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 26.203746796s of 26.265848160s, submitted: 25
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b5291800 session 0x55e0b7bb8780
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104218624 unmapped: 26148864 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1188520 data_alloc: 218103808 data_used: 6967296
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9c37000/0x0/0x4ffc00000, data 0x197d4ce/0x1a35000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104218624 unmapped: 26148864 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104218624 unmapped: 26148864 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9c37000/0x0/0x4ffc00000, data 0x197d4ce/0x1a35000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104218624 unmapped: 26148864 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9c37000/0x0/0x4ffc00000, data 0x197d4ce/0x1a35000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104218624 unmapped: 26148864 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b78fcc00 session 0x55e0b78b3c20
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104218624 unmapped: 26148864 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1188520 data_alloc: 218103808 data_used: 6967296
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b78fd800 session 0x55e0b52334a0
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104218624 unmapped: 26148864 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b5290c00 session 0x55e0b7943e00
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b5291800 session 0x55e0b6e87e00
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104251392 unmapped: 26116096 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104251392 unmapped: 26116096 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9c36000/0x0/0x4ffc00000, data 0x197d4de/0x1a36000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 108912640 unmapped: 21454848 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 108912640 unmapped: 21454848 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1250374 data_alloc: 234881024 data_used: 15360000
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9c36000/0x0/0x4ffc00000, data 0x197d4de/0x1a36000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 108912640 unmapped: 21454848 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 1800.0 total, 600.0 interval
Cumulative writes: 9259 writes, 35K keys, 9259 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
Cumulative WAL: 9259 writes, 2171 syncs, 4.26 writes per sync, written: 0.02 GB, 0.01 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 1372 writes, 3774 keys, 1372 commit groups, 1.0 writes per commit group, ingest: 3.39 MB, 0.01 MB/s
Interval WAL: 1372 writes, 622 syncs, 2.21 writes per sync, written: 0.00 GB, 0.01 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 108912640 unmapped: 21454848 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 108912640 unmapped: 21454848 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 108912640 unmapped: 21454848 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 108920832 unmapped: 21446656 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1250374 data_alloc: 234881024 data_used: 15360000
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9c36000/0x0/0x4ffc00000, data 0x197d4de/0x1a36000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 108920832 unmapped: 21446656 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9c36000/0x0/0x4ffc00000, data 0x197d4de/0x1a36000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 108920832 unmapped: 21446656 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 108920832 unmapped: 21446656 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 18.684480667s of 18.732368469s, submitted: 11
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 111058944 unmapped: 19308544 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 111206400 unmapped: 19161088 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1310776 data_alloc: 234881024 data_used: 15597568
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f93e2000/0x0/0x4ffc00000, data 0x21d14de/0x228a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110436352 unmapped: 19931136 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110436352 unmapped: 19931136 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110436352 unmapped: 19931136 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110436352 unmapped: 19931136 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110436352 unmapped: 19931136 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1310594 data_alloc: 234881024 data_used: 15597568
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110436352 unmapped: 19931136 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f93e0000/0x0/0x4ffc00000, data 0x21d34de/0x228c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f93e0000/0x0/0x4ffc00000, data 0x21d34de/0x228c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110469120 unmapped: 19898368 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110477312 unmapped: 19890176 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110477312 unmapped: 19890176 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f93e0000/0x0/0x4ffc00000, data 0x21d34de/0x228c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110477312 unmapped: 19890176 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1310594 data_alloc: 234881024 data_used: 15597568
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110477312 unmapped: 19890176 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110477312 unmapped: 19890176 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110477312 unmapped: 19890176 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f93e0000/0x0/0x4ffc00000, data 0x21d34de/0x228c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110477312 unmapped: 19890176 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110477312 unmapped: 19890176 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1310594 data_alloc: 234881024 data_used: 15597568
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110477312 unmapped: 19890176 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110477312 unmapped: 19890176 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110477312 unmapped: 19890176 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110485504 unmapped: 19881984 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f93e0000/0x0/0x4ffc00000, data 0x21d34de/0x228c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110485504 unmapped: 19881984 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1310594 data_alloc: 234881024 data_used: 15597568
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b78fc800 session 0x55e0b52550e0
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b78fd400 session 0x55e0b7bfe960
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b78fd000 session 0x55e0b5d7b2c0
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b78fd000 session 0x55e0b79d3e00
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 21.777948380s of 22.418426514s, submitted: 57
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110485504 unmapped: 19881984 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b5290c00 session 0x55e0b79d3c20
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b5291800 session 0x55e0b79d3860
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b78fc800 session 0x55e0b79d3680
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b78fd400 session 0x55e0b7bb4780
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b78fd400 session 0x55e0b6f4da40
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f93c1000/0x0/0x4ffc00000, data 0x21f14ee/0x22ab000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110387200 unmapped: 19980288 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110387200 unmapped: 19980288 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110387200 unmapped: 19980288 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110387200 unmapped: 19980288 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1314878 data_alloc: 234881024 data_used: 15597568
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110387200 unmapped: 19980288 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f93c1000/0x0/0x4ffc00000, data 0x21f14ee/0x22ab000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110403584 unmapped: 19963904 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b5290c00 session 0x55e0b7af2960
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110419968 unmapped: 19947520 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110419968 unmapped: 19947520 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110419968 unmapped: 19947520 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1318296 data_alloc: 234881024 data_used: 15601664
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110419968 unmapped: 19947520 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f93c0000/0x0/0x4ffc00000, data 0x21f1511/0x22ac000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110444544 unmapped: 19922944 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110452736 unmapped: 19914752 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110452736 unmapped: 19914752 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110452736 unmapped: 19914752 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1318600 data_alloc: 234881024 data_used: 15634432
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f93c0000/0x0/0x4ffc00000, data 0x21f1511/0x22ac000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110452736 unmapped: 19914752 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110452736 unmapped: 19914752 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110452736 unmapped: 19914752 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f93c0000/0x0/0x4ffc00000, data 0x21f1511/0x22ac000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110452736 unmapped: 19914752 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f93c0000/0x0/0x4ffc00000, data 0x21f1511/0x22ac000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110460928 unmapped: 19906560 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1318600 data_alloc: 234881024 data_used: 15634432
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110460928 unmapped: 19906560 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110460928 unmapped: 19906560 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f93c0000/0x0/0x4ffc00000, data 0x21f1511/0x22ac000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 21.989015579s of 22.041868210s, submitted: 12
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 113614848 unmapped: 16752640 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 113631232 unmapped: 16736256 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112967680 unmapped: 17399808 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1349310 data_alloc: 234881024 data_used: 15659008
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112967680 unmapped: 17399808 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112967680 unmapped: 17399808 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f8fe0000/0x0/0x4ffc00000, data 0x25d1511/0x268c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112967680 unmapped: 17399808 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112967680 unmapped: 17399808 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112967680 unmapped: 17399808 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1349976 data_alloc: 234881024 data_used: 15659008
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b5291800 session 0x55e0b7bb4000
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b711d000 session 0x55e0b5234780
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f8fe0000/0x0/0x4ffc00000, data 0x25d1511/0x268c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 113016832 unmapped: 17350656 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 113016832 unmapped: 17350656 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b711c000 session 0x55e0b82dab40
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 113033216 unmapped: 17334272 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9329000/0x0/0x4ffc00000, data 0x21d34de/0x228c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 113033216 unmapped: 17334272 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 113033216 unmapped: 17334272 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1316518 data_alloc: 234881024 data_used: 15597568
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 113033216 unmapped: 17334272 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9329000/0x0/0x4ffc00000, data 0x21d34de/0x228c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b78fcc00 session 0x55e0b7bb83c0
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 113033216 unmapped: 17334272 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6eee400 session 0x55e0b52530e0
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 14.346494675s of 14.904012680s, submitted: 67
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b711c000 session 0x55e0b7af21e0
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 107470848 unmapped: 22896640 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 107470848 unmapped: 22896640 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 107470848 unmapped: 22896640 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1135491 data_alloc: 218103808 data_used: 6443008
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 107470848 unmapped: 22896640 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa4bc000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 107470848 unmapped: 22896640 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 107470848 unmapped: 22896640 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 107470848 unmapped: 22896640 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 107470848 unmapped: 22896640 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1135491 data_alloc: 218103808 data_used: 6443008
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 107470848 unmapped: 22896640 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa4bc000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 107470848 unmapped: 22896640 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 107470848 unmapped: 22896640 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 107470848 unmapped: 22896640 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 107470848 unmapped: 22896640 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1135491 data_alloc: 218103808 data_used: 6443008
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 107470848 unmapped: 22896640 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 107470848 unmapped: 22896640 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa4bc000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 107470848 unmapped: 22896640 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 107470848 unmapped: 22896640 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1135491 data_alloc: 218103808 data_used: 6443008
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 107470848 unmapped: 22896640 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 107470848 unmapped: 22896640 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa4bc000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 107331584 unmapped: 23035904 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 107331584 unmapped: 23035904 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa4bc000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 107331584 unmapped: 23035904 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1135491 data_alloc: 218103808 data_used: 6443008
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 107331584 unmapped: 23035904 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 107331584 unmapped: 23035904 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa4bc000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 107331584 unmapped: 23035904 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b711d000 session 0x55e0b6f4bc20
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b78fd400 session 0x55e0b7af0f00
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6eee400 session 0x55e0b7af10e0
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b711c000 session 0x55e0b515b4a0
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 25.613090515s of 25.649431229s, submitted: 10
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109101056 unmapped: 21266432 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b711d000 session 0x55e0b5d7a5a0
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa116000/0x0/0x4ffc00000, data 0x149e4ce/0x1556000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 107003904 unmapped: 23363584 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1167807 data_alloc: 218103808 data_used: 6443008
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 107003904 unmapped: 23363584 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 107003904 unmapped: 23363584 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa116000/0x0/0x4ffc00000, data 0x149e4ce/0x1556000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 107003904 unmapped: 23363584 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b78fcc00 session 0x55e0b7af0d20
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 107003904 unmapped: 23363584 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 107003904 unmapped: 23363584 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1194620 data_alloc: 234881024 data_used: 10178560
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 107257856 unmapped: 23109632 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 107257856 unmapped: 23109632 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 107257856 unmapped: 23109632 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa116000/0x0/0x4ffc00000, data 0x149e4ce/0x1556000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 107257856 unmapped: 23109632 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 107257856 unmapped: 23109632 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa116000/0x0/0x4ffc00000, data 0x149e4ce/0x1556000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1194620 data_alloc: 234881024 data_used: 10178560
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 107257856 unmapped: 23109632 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 107257856 unmapped: 23109632 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 107257856 unmapped: 23109632 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 107257856 unmapped: 23109632 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa116000/0x0/0x4ffc00000, data 0x149e4ce/0x1556000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 107257856 unmapped: 23109632 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa116000/0x0/0x4ffc00000, data 0x149e4ce/0x1556000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1194620 data_alloc: 234881024 data_used: 10178560
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 107257856 unmapped: 23109632 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 16.353801727s of 17.207458496s, submitted: 14
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa116000/0x0/0x4ffc00000, data 0x149e4ce/0x1556000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [0,0,0,0,0,0,0,1])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110215168 unmapped: 20152320 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 108879872 unmapped: 21487616 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110469120 unmapped: 19898368 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110469120 unmapped: 19898368 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1250646 data_alloc: 234881024 data_used: 10629120
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110469120 unmapped: 19898368 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9a57000/0x0/0x4ffc00000, data 0x1b554ce/0x1c0d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110469120 unmapped: 19898368 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110469120 unmapped: 19898368 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110469120 unmapped: 19898368 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110469120 unmapped: 19898368 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1250662 data_alloc: 234881024 data_used: 10629120
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110469120 unmapped: 19898368 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110469120 unmapped: 19898368 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9a57000/0x0/0x4ffc00000, data 0x1b554ce/0x1c0d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110469120 unmapped: 19898368 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9a57000/0x0/0x4ffc00000, data 0x1b554ce/0x1c0d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110469120 unmapped: 19898368 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110469120 unmapped: 19898368 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1250662 data_alloc: 234881024 data_used: 10629120
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110469120 unmapped: 19898368 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110469120 unmapped: 19898368 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110469120 unmapped: 19898368 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110469120 unmapped: 19898368 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9a57000/0x0/0x4ffc00000, data 0x1b554ce/0x1c0d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110477312 unmapped: 19890176 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1250662 data_alloc: 234881024 data_used: 10629120
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110477312 unmapped: 19890176 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9a57000/0x0/0x4ffc00000, data 0x1b554ce/0x1c0d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9a57000/0x0/0x4ffc00000, data 0x1b554ce/0x1c0d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110477312 unmapped: 19890176 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110477312 unmapped: 19890176 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110477312 unmapped: 19890176 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110485504 unmapped: 19881984 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b718fc00 session 0x55e0b81a0960
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6eee400 session 0x55e0b7d2fe00
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b711c000 session 0x55e0b5e4c000
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b711d000 session 0x55e0b7d45860
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 23.747030258s of 24.820497513s, submitted: 77
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1325345 data_alloc: 234881024 data_used: 10629120
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109453312 unmapped: 20914176 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b78fcc00 session 0x55e0b82825a0
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6e71800 session 0x55e0b7af0b40
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6e71800 session 0x55e0b7bb81e0
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6eee400 session 0x55e0b6e87680
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b711c000 session 0x55e0b6f4b2c0
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109469696 unmapped: 20897792 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9205000/0x0/0x4ffc00000, data 0x23ad540/0x2467000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109469696 unmapped: 20897792 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109477888 unmapped: 20889600 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109486080 unmapped: 20881408 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1309713 data_alloc: 234881024 data_used: 10633216
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109486080 unmapped: 20881408 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109486080 unmapped: 20881408 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9205000/0x0/0x4ffc00000, data 0x23ad540/0x2467000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 115597312 unmapped: 14770176 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b5291400 session 0x55e0b5e530e0
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 115597312 unmapped: 14770176 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6a18400 session 0x55e0b732e780
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 115597312 unmapped: 14770176 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1367489 data_alloc: 234881024 data_used: 19181568
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 115597312 unmapped: 14770176 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6a29400 session 0x55e0b5157680
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9205000/0x0/0x4ffc00000, data 0x23ad540/0x2467000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 115613696 unmapped: 14753792 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 115613696 unmapped: 14753792 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 115613696 unmapped: 14753792 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 115613696 unmapped: 14753792 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1367489 data_alloc: 234881024 data_used: 19181568
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 115613696 unmapped: 14753792 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 115613696 unmapped: 14753792 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9205000/0x0/0x4ffc00000, data 0x23ad540/0x2467000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 17.074026108s of 17.169523239s, submitted: 31
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 116539392 unmapped: 13828096 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119160832 unmapped: 11206656 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119455744 unmapped: 10911744 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f8aca000/0x0/0x4ffc00000, data 0x2ae8540/0x2ba2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1425171 data_alloc: 234881024 data_used: 19357696
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119554048 unmapped: 10813440 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b5d45400 session 0x55e0b5909680
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119554048 unmapped: 10813440 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f8aca000/0x0/0x4ffc00000, data 0x2ae8540/0x2ba2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119586816 unmapped: 10780672 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f8aca000/0x0/0x4ffc00000, data 0x2ae8540/0x2ba2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119619584 unmapped: 10747904 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119619584 unmapped: 10747904 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1422259 data_alloc: 234881024 data_used: 19361792
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118882304 unmapped: 11485184 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118890496 unmapped: 11476992 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118890496 unmapped: 11476992 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118890496 unmapped: 11476992 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f8aa9000/0x0/0x4ffc00000, data 0x2b09540/0x2bc3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f8aa9000/0x0/0x4ffc00000, data 0x2b09540/0x2bc3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118956032 unmapped: 11411456 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1422259 data_alloc: 234881024 data_used: 19361792
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118956032 unmapped: 11411456 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118956032 unmapped: 11411456 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118956032 unmapped: 11411456 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118964224 unmapped: 11403264 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f8aa9000/0x0/0x4ffc00000, data 0x2b09540/0x2bc3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 15.915831566s of 17.032249451s, submitted: 304
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118996992 unmapped: 11370496 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1422427 data_alloc: 234881024 data_used: 19361792
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118996992 unmapped: 11370496 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118996992 unmapped: 11370496 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f8aa9000/0x0/0x4ffc00000, data 0x2b09540/0x2bc3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118996992 unmapped: 11370496 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119005184 unmapped: 11362304 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119005184 unmapped: 11362304 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1422483 data_alloc: 234881024 data_used: 19361792
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119005184 unmapped: 11362304 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f8aa6000/0x0/0x4ffc00000, data 0x2b0c540/0x2bc6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119005184 unmapped: 11362304 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119005184 unmapped: 11362304 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119021568 unmapped: 11345920 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f8aa6000/0x0/0x4ffc00000, data 0x2b0c540/0x2bc6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119021568 unmapped: 11345920 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1422483 data_alloc: 234881024 data_used: 19361792
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119021568 unmapped: 11345920 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119021568 unmapped: 11345920 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119021568 unmapped: 11345920 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f8aa6000/0x0/0x4ffc00000, data 0x2b0c540/0x2bc6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119021568 unmapped: 11345920 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119021568 unmapped: 11345920 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 15.685111046s of 15.706851959s, submitted: 6
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1422483 data_alloc: 234881024 data_used: 19361792
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119021568 unmapped: 11345920 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119029760 unmapped: 11337728 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f8aa6000/0x0/0x4ffc00000, data 0x2b0c540/0x2bc6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119029760 unmapped: 11337728 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119029760 unmapped: 11337728 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f8aa6000/0x0/0x4ffc00000, data 0x2b0c540/0x2bc6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119037952 unmapped: 11329536 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f8aa6000/0x0/0x4ffc00000, data 0x2b0c540/0x2bc6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1422483 data_alloc: 234881024 data_used: 19361792
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119037952 unmapped: 11329536 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119037952 unmapped: 11329536 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f8aa6000/0x0/0x4ffc00000, data 0x2b0c540/0x2bc6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119037952 unmapped: 11329536 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119046144 unmapped: 11321344 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119046144 unmapped: 11321344 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f8aa6000/0x0/0x4ffc00000, data 0x2b0c540/0x2bc6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1422483 data_alloc: 234881024 data_used: 19361792
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119046144 unmapped: 11321344 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119046144 unmapped: 11321344 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119046144 unmapped: 11321344 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119046144 unmapped: 11321344 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119054336 unmapped: 11313152 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f8aa6000/0x0/0x4ffc00000, data 0x2b0c540/0x2bc6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1422483 data_alloc: 234881024 data_used: 19361792
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119054336 unmapped: 11313152 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f8aa6000/0x0/0x4ffc00000, data 0x2b0c540/0x2bc6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119054336 unmapped: 11313152 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 16.836561203s of 16.850162506s, submitted: 4
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b711d000 session 0x55e0b793ef00
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 116113408 unmapped: 14254080 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6c52400 session 0x55e0b5230b40
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 116113408 unmapped: 14254080 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 116113408 unmapped: 14254080 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1254569 data_alloc: 234881024 data_used: 10612736
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 116113408 unmapped: 14254080 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9a5e000/0x0/0x4ffc00000, data 0x1b554ce/0x1c0d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 116113408 unmapped: 14254080 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 116113408 unmapped: 14254080 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 116113408 unmapped: 14254080 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 115499008 unmapped: 14868480 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9a5e000/0x0/0x4ffc00000, data 0x1b554ce/0x1c0d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1254781 data_alloc: 234881024 data_used: 10612736
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 115499008 unmapped: 14868480 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 115499008 unmapped: 14868480 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 115499008 unmapped: 14868480 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9a5f000/0x0/0x4ffc00000, data 0x1b554ce/0x1c0d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b718f800 session 0x55e0b7af01e0
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.366408348s of 11.507596970s, submitted: 22
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6a18c00 session 0x55e0b7a030e0
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 115499008 unmapped: 14868480 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9a5f000/0x0/0x4ffc00000, data 0x1b554ce/0x1c0d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 113172480 unmapped: 17195008 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6e71800 session 0x55e0b793fe00
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1150203 data_alloc: 218103808 data_used: 6443008
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 113213440 unmapped: 17154048 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa4bc000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 113213440 unmapped: 17154048 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 113213440 unmapped: 17154048 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa4bc000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 113213440 unmapped: 17154048 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 113213440 unmapped: 17154048 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa4bc000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1150203 data_alloc: 218103808 data_used: 6443008
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 113213440 unmapped: 17154048 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 113213440 unmapped: 17154048 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa4bc000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 113213440 unmapped: 17154048 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 113213440 unmapped: 17154048 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 113213440 unmapped: 17154048 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1150203 data_alloc: 218103808 data_used: 6443008
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 113213440 unmapped: 17154048 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 113213440 unmapped: 17154048 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa4bc000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 113213440 unmapped: 17154048 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 113213440 unmapped: 17154048 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 113213440 unmapped: 17154048 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa4bc000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1150203 data_alloc: 218103808 data_used: 6443008
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 113213440 unmapped: 17154048 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 113213440 unmapped: 17154048 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa4bc000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 113213440 unmapped: 17154048 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 113213440 unmapped: 17154048 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 113213440 unmapped: 17154048 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1150203 data_alloc: 218103808 data_used: 6443008
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 113213440 unmapped: 17154048 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa4bc000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 113213440 unmapped: 17154048 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 113213440 unmapped: 17154048 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa4bc000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 113213440 unmapped: 17154048 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa4bc000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 113213440 unmapped: 17154048 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa4bc000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1150203 data_alloc: 218103808 data_used: 6443008
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 113213440 unmapped: 17154048 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 113213440 unmapped: 17154048 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 113213440 unmapped: 17154048 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 29.599931717s of 29.781423569s, submitted: 23
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6eee400 session 0x55e0b7d2e960
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112148480 unmapped: 21897216 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b711c000 session 0x55e0b5e52000
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9f9a000/0x0/0x4ffc00000, data 0x16194f7/0x16d2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112148480 unmapped: 21897216 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1194498 data_alloc: 218103808 data_used: 6443008
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112148480 unmapped: 21897216 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6a18c00 session 0x55e0b793f680
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9f9a000/0x0/0x4ffc00000, data 0x1619530/0x16d2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6e71800 session 0x55e0b6f4b0e0
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112148480 unmapped: 21897216 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6eee400 session 0x55e0b7af05a0
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b718f800 session 0x55e0b6f4af00
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112156672 unmapped: 21889024 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9f9a000/0x0/0x4ffc00000, data 0x1619530/0x16d2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112156672 unmapped: 21889024 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112156672 unmapped: 21889024 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1203823 data_alloc: 218103808 data_used: 7593984
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112156672 unmapped: 21889024 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112164864 unmapped: 21880832 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112164864 unmapped: 21880832 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112164864 unmapped: 21880832 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9f9a000/0x0/0x4ffc00000, data 0x1619530/0x16d2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112164864 unmapped: 21880832 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: mgrc ms_handle_reset ms_handle_reset con 0x55e0b711d400
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: mgrc reconnect Terminating session with v2:192.168.122.100:6800/1444264366
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: mgrc reconnect Starting new session with [v2:192.168.122.100:6800/1444264366,v1:192.168.122.100:6801/1444264366]
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: mgrc handle_mgr_configure stats_period=5
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b7c07400 session 0x55e0b82d90e0
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.048700333s of 12.292609215s, submitted: 36
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b711d000 session 0x55e0b515cd20
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9f9a000/0x0/0x4ffc00000, data 0x1619530/0x16d2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1231811 data_alloc: 234881024 data_used: 11739136
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110141440 unmapped: 23904256 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6a18c00 session 0x55e0b5e52000
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110149632 unmapped: 23896064 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110149632 unmapped: 23896064 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110149632 unmapped: 23896064 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110149632 unmapped: 23896064 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1154953 data_alloc: 218103808 data_used: 6443008
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110149632 unmapped: 23896064 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa4bc000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110149632 unmapped: 23896064 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110149632 unmapped: 23896064 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110149632 unmapped: 23896064 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110149632 unmapped: 23896064 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa4bc000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1154953 data_alloc: 218103808 data_used: 6443008
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110149632 unmapped: 23896064 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa4bc000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110149632 unmapped: 23896064 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110149632 unmapped: 23896064 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6e71800 session 0x55e0b6e86960
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6eee400 session 0x55e0b6e861e0
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6a18c00 session 0x55e0b6e86b40
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6e71800 session 0x55e0b6e872c0
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.875237465s of 12.953323364s, submitted: 29
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b711d000 session 0x55e0b6e86000
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b7c07400 session 0x55e0b5d7b2c0
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b718f800 session 0x55e0b5d7a780
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6a18c00 session 0x55e0b6f492c0
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110403584 unmapped: 23642112 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6e71800 session 0x55e0b6f48960
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0f0000/0x0/0x4ffc00000, data 0x14c2507/0x157c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110403584 unmapped: 23642112 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192120 data_alloc: 218103808 data_used: 6443008
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110403584 unmapped: 23642112 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b711d000 session 0x55e0b6f4b0e0
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa06d000/0x0/0x4ffc00000, data 0x1545540/0x15ff000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b7c07400 session 0x55e0b6f4af00
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110403584 unmapped: 23642112 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b7c07000 session 0x55e0b7af01e0
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110723072 unmapped: 23322624 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa06d000/0x0/0x4ffc00000, data 0x1545540/0x15ff000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6a18c00 session 0x55e0b5230b40
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110723072 unmapped: 23322624 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110723072 unmapped: 23322624 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1193988 data_alloc: 218103808 data_used: 6451200
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110723072 unmapped: 23322624 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 111575040 unmapped: 22470656 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 111575040 unmapped: 22470656 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa049000/0x0/0x4ffc00000, data 0x1569540/0x1623000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 111575040 unmapped: 22470656 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6e71800 session 0x55e0b5157680
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b711d000 session 0x55e0b523ad20
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 111575040 unmapped: 22470656 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.350914001s of 11.770071983s, submitted: 28
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1162740 data_alloc: 218103808 data_used: 6447104
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109666304 unmapped: 24379392 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109666304 unmapped: 24379392 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109666304 unmapped: 24379392 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0cd000/0x0/0x4ffc00000, data 0x111c4ce/0x11d4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109666304 unmapped: 24379392 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b7c07400 session 0x55e0b6e87a40
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109674496 unmapped: 24371200 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1159728 data_alloc: 218103808 data_used: 6443008
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109674496 unmapped: 24371200 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109674496 unmapped: 24371200 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa4bc000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109674496 unmapped: 24371200 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109674496 unmapped: 24371200 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109674496 unmapped: 24371200 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1159728 data_alloc: 218103808 data_used: 6443008
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109674496 unmapped: 24371200 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109674496 unmapped: 24371200 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa4bc000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa4bc000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109674496 unmapped: 24371200 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109674496 unmapped: 24371200 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109674496 unmapped: 24371200 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1159728 data_alloc: 218103808 data_used: 6443008
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109674496 unmapped: 24371200 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109674496 unmapped: 24371200 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa4bc000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109674496 unmapped: 24371200 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109674496 unmapped: 24371200 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa4bc000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109674496 unmapped: 24371200 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1159728 data_alloc: 218103808 data_used: 6443008
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109674496 unmapped: 24371200 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109674496 unmapped: 24371200 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa4bc000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109674496 unmapped: 24371200 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109674496 unmapped: 24371200 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109674496 unmapped: 24371200 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1159728 data_alloc: 218103808 data_used: 6443008
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109674496 unmapped: 24371200 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109674496 unmapped: 24371200 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109674496 unmapped: 24371200 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa4bc000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109674496 unmapped: 24371200 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109674496 unmapped: 24371200 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1159728 data_alloc: 218103808 data_used: 6443008
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109674496 unmapped: 24371200 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b7c06000 session 0x55e0b6a14960
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6a18c00 session 0x55e0b6e86780
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6e71800 session 0x55e0b4d34780
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109674496 unmapped: 24371200 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b711d000 session 0x55e0b7bb8960
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 32.203327179s of 32.274513245s, submitted: 15
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b7c07400 session 0x55e0b5233860
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110624768 unmapped: 23420928 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b7c06800 session 0x55e0b515d0e0
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6a18c00 session 0x55e0b82a83c0
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6e71800 session 0x55e0b515d2c0
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b711d000 session 0x55e0b5157860
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9e87000/0x0/0x4ffc00000, data 0x172c4de/0x17e5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109953024 unmapped: 24092672 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109953024 unmapped: 24092672 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1218431 data_alloc: 218103808 data_used: 6447104
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109953024 unmapped: 24092672 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109953024 unmapped: 24092672 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109969408 unmapped: 24076288 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b7c06800 session 0x55e0b53a4000
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9e86000/0x0/0x4ffc00000, data 0x172c501/0x17e6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 111017984 unmapped: 23027712 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112967680 unmapped: 21078016 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9e86000/0x0/0x4ffc00000, data 0x172c501/0x17e6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1255024 data_alloc: 234881024 data_used: 11526144
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112984064 unmapped: 21061632 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9e86000/0x0/0x4ffc00000, data 0x172c501/0x17e6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112984064 unmapped: 21061632 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112984064 unmapped: 21061632 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112984064 unmapped: 21061632 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9e86000/0x0/0x4ffc00000, data 0x172c501/0x17e6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112984064 unmapped: 21061632 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1255024 data_alloc: 234881024 data_used: 11526144
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112984064 unmapped: 21061632 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112984064 unmapped: 21061632 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112984064 unmapped: 21061632 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112984064 unmapped: 21061632 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112984064 unmapped: 21061632 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9e86000/0x0/0x4ffc00000, data 0x172c501/0x17e6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 17.454790115s of 18.098155975s, submitted: 35
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1318202 data_alloc: 234881024 data_used: 11956224
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 116703232 unmapped: 17342464 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 115965952 unmapped: 18079744 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 116269056 unmapped: 17776640 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 116269056 unmapped: 17776640 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f919d000/0x0/0x4ffc00000, data 0x2004501/0x20be000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 116269056 unmapped: 17776640 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1328094 data_alloc: 234881024 data_used: 12349440
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 116269056 unmapped: 17776640 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f919d000/0x0/0x4ffc00000, data 0x2004501/0x20be000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 116269056 unmapped: 17776640 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 116269056 unmapped: 17776640 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f919d000/0x0/0x4ffc00000, data 0x2004501/0x20be000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 116269056 unmapped: 17776640 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 115515392 unmapped: 18530304 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1326470 data_alloc: 234881024 data_used: 12361728
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 115515392 unmapped: 18530304 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 115515392 unmapped: 18530304 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 115515392 unmapped: 18530304 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 115515392 unmapped: 18530304 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9177000/0x0/0x4ffc00000, data 0x202b501/0x20e5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 13.622363091s of 14.404953957s, submitted: 103
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b7c07400 session 0x55e0b6f4ba40
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 115523584 unmapped: 18522112 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1325768 data_alloc: 234881024 data_used: 12357632
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112304128 unmapped: 21741568 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6a18c00 session 0x55e0b6e86b40
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112312320 unmapped: 21733376 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ab000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112312320 unmapped: 21733376 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112312320 unmapped: 21733376 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112312320 unmapped: 21733376 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1170924 data_alloc: 218103808 data_used: 6443008
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112312320 unmapped: 21733376 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112312320 unmapped: 21733376 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ab000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112312320 unmapped: 21733376 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112320512 unmapped: 21725184 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112320512 unmapped: 21725184 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1170924 data_alloc: 218103808 data_used: 6443008
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112320512 unmapped: 21725184 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ab000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112320512 unmapped: 21725184 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112320512 unmapped: 21725184 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112320512 unmapped: 21725184 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112320512 unmapped: 21725184 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1170924 data_alloc: 218103808 data_used: 6443008
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112328704 unmapped: 21716992 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112328704 unmapped: 21716992 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ab000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112328704 unmapped: 21716992 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112328704 unmapped: 21716992 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112328704 unmapped: 21716992 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1170924 data_alloc: 218103808 data_used: 6443008
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112328704 unmapped: 21716992 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112328704 unmapped: 21716992 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ab000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112328704 unmapped: 21716992 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112328704 unmapped: 21716992 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112328704 unmapped: 21716992 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1170924 data_alloc: 218103808 data_used: 6443008
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112328704 unmapped: 21716992 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6e71800 session 0x55e0b6f4b0e0
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b711d000 session 0x55e0b5157a40
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b7c06800 session 0x55e0b5156960
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b78b7c00 session 0x55e0b5238b40
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 27.455039978s of 27.564855576s, submitted: 39
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112549888 unmapped: 21495808 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6a18c00 session 0x55e0b7af01e0
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9b8a000/0x0/0x4ffc00000, data 0x161a4ce/0x16d2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112549888 unmapped: 21495808 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112549888 unmapped: 21495808 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112549888 unmapped: 21495808 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1213046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112549888 unmapped: 21495808 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6e71800 session 0x55e0b5a603c0
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b711d000 session 0x55e0b5a605a0
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112558080 unmapped: 21487616 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b7c06800 session 0x55e0b73401e0
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9b8a000/0x0/0x4ffc00000, data 0x161a4ce/0x16d2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [1])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b78b6400 session 0x55e0b5d7b2c0
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112721920 unmapped: 21323776 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112730112 unmapped: 21315584 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112730112 unmapped: 21315584 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1254877 data_alloc: 234881024 data_used: 11730944
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112893952 unmapped: 21151744 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112893952 unmapped: 21151744 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9b65000/0x0/0x4ffc00000, data 0x163e4de/0x16f7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112893952 unmapped: 21151744 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112902144 unmapped: 21143552 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112902144 unmapped: 21143552 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1254877 data_alloc: 234881024 data_used: 11730944
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112902144 unmapped: 21143552 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112902144 unmapped: 21143552 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9b65000/0x0/0x4ffc00000, data 0x163e4de/0x16f7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112902144 unmapped: 21143552 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112902144 unmapped: 21143552 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112902144 unmapped: 21143552 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 18.325626373s of 18.633775711s, submitted: 19
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1293411 data_alloc: 234881024 data_used: 11763712
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9b65000/0x0/0x4ffc00000, data 0x163e4de/0x16f7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [0,0,0,0,0,0,0,7,1])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 116752384 unmapped: 17293312 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f96a2000/0x0/0x4ffc00000, data 0x1b014de/0x1bba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118398976 unmapped: 15646720 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118398976 unmapped: 15646720 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b711d000 session 0x55e0b515b680
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b78b6400 session 0x55e0b515af00
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f969c000/0x0/0x4ffc00000, data 0x1b074de/0x1bc0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b7c06800 session 0x55e0b732e780
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b78b6000 session 0x55e0b732f680
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b4793800 session 0x55e0b732f860
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b4793800 session 0x55e0b7af34a0
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118988800 unmapped: 28704768 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b711d000 session 0x55e0b523a780
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b78b6000 session 0x55e0b5e52f00
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b78b6400 session 0x55e0b6f48f00
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117465088 unmapped: 30228480 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1363146 data_alloc: 234881024 data_used: 12959744
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117465088 unmapped: 30228480 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117465088 unmapped: 30228480 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f8d9e000/0x0/0x4ffc00000, data 0x24054de/0x24be000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117465088 unmapped: 30228480 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117473280 unmapped: 30220288 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b7c06800 session 0x55e0b524e1e0
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117637120 unmapped: 30056448 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1367903 data_alloc: 234881024 data_used: 12959744
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117637120 unmapped: 30056448 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117456896 unmapped: 30236672 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f8d7a000/0x0/0x4ffc00000, data 0x24294de/0x24e2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 124444672 unmapped: 23248896 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f8d7a000/0x0/0x4ffc00000, data 0x24294de/0x24e2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 124477440 unmapped: 23216128 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 124477440 unmapped: 23216128 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f8d7a000/0x0/0x4ffc00000, data 0x24294de/0x24e2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1431135 data_alloc: 234881024 data_used: 22302720
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 124510208 unmapped: 23183360 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f8d7a000/0x0/0x4ffc00000, data 0x24294de/0x24e2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 124542976 unmapped: 23150592 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 124542976 unmapped: 23150592 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 124542976 unmapped: 23150592 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 124542976 unmapped: 23150592 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1431135 data_alloc: 234881024 data_used: 22302720
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 124542976 unmapped: 23150592 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 124542976 unmapped: 23150592 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f8d7a000/0x0/0x4ffc00000, data 0x24294de/0x24e2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 21.316621780s of 22.041751862s, submitted: 66
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 126763008 unmapped: 20930560 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 128352256 unmapped: 19341312 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 128819200 unmapped: 18874368 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1475013 data_alloc: 234881024 data_used: 22556672
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 128819200 unmapped: 18874368 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f8a0f000/0x0/0x4ffc00000, data 0x277e4de/0x2837000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 128827392 unmapped: 18866176 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 128827392 unmapped: 18866176 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 128827392 unmapped: 18866176 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 128827392 unmapped: 18866176 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f8a0f000/0x0/0x4ffc00000, data 0x277e4de/0x2837000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1475029 data_alloc: 234881024 data_used: 22556672
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 128860160 unmapped: 18833408 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 128860160 unmapped: 18833408 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.479086876s of 10.626307487s, submitted: 52
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 128892928 unmapped: 18800640 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 128221184 unmapped: 19472384 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 128221184 unmapped: 19472384 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f8a25000/0x0/0x4ffc00000, data 0x277e4de/0x2837000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1467973 data_alloc: 234881024 data_used: 22556672
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 128229376 unmapped: 19464192 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 128229376 unmapped: 19464192 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b7c06800 session 0x55e0b5252d20
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b4793800 session 0x55e0b515d680
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 128237568 unmapped: 19456000 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b78b6400 session 0x55e0b7bff860
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f969c000/0x0/0x4ffc00000, data 0x1b074de/0x1bc0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 121937920 unmapped: 25755648 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 121937920 unmapped: 25755648 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307237 data_alloc: 234881024 data_used: 12959744
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 121937920 unmapped: 25755648 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6e71800 session 0x55e0b6ee8d20
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6a18c00 session 0x55e0b6ee83c0
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 121937920 unmapped: 25755648 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f969c000/0x0/0x4ffc00000, data 0x1b074de/0x1bc0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.750218391s of 10.112176895s, submitted: 57
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118710272 unmapped: 28983296 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b4793800 session 0x55e0b4a8d680
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa088000/0x0/0x4ffc00000, data 0x111c4ce/0x11d4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118710272 unmapped: 28983296 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118710272 unmapped: 28983296 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1189236 data_alloc: 218103808 data_used: 6443008
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118710272 unmapped: 28983296 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118710272 unmapped: 28983296 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118710272 unmapped: 28983296 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118710272 unmapped: 28983296 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118710272 unmapped: 28983296 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1189236 data_alloc: 218103808 data_used: 6443008
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118710272 unmapped: 28983296 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118710272 unmapped: 28983296 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118710272 unmapped: 28983296 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118710272 unmapped: 28983296 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118710272 unmapped: 28983296 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1189236 data_alloc: 218103808 data_used: 6443008
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118710272 unmapped: 28983296 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118710272 unmapped: 28983296 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118710272 unmapped: 28983296 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118710272 unmapped: 28983296 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118710272 unmapped: 28983296 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1189236 data_alloc: 218103808 data_used: 6443008
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118710272 unmapped: 28983296 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118710272 unmapped: 28983296 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118710272 unmapped: 28983296 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118710272 unmapped: 28983296 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118710272 unmapped: 28983296 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1189236 data_alloc: 218103808 data_used: 6443008
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118710272 unmapped: 28983296 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6a18c00 session 0x55e0b5e52000
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118710272 unmapped: 28983296 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6e71800 session 0x55e0b6f49a40
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b78b6400 session 0x55e0b7cc45a0
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b7c06800 session 0x55e0b4a8d860
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 24.401563644s of 24.434398651s, submitted: 11
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b4793800 session 0x55e0b6f46780
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9fd7000/0x0/0x4ffc00000, data 0x11cd4ce/0x1285000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119095296 unmapped: 28598272 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119095296 unmapped: 28598272 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119095296 unmapped: 28598272 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198392 data_alloc: 218103808 data_used: 6443008
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9fd7000/0x0/0x4ffc00000, data 0x11cd4ce/0x1285000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119095296 unmapped: 28598272 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119095296 unmapped: 28598272 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6a18c00 session 0x55e0b6f47a40
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9fd7000/0x0/0x4ffc00000, data 0x11cd4ce/0x1285000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6e71800 session 0x55e0b6f47e00
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119111680 unmapped: 28581888 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b78b6400 session 0x55e0b6f474a0
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b4792000 session 0x55e0b6f461e0
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9fd7000/0x0/0x4ffc00000, data 0x11cd4ce/0x1285000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118538240 unmapped: 29155328 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118554624 unmapped: 29138944 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1207611 data_alloc: 218103808 data_used: 7229440
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9fd6000/0x0/0x4ffc00000, data 0x11cd4de/0x1286000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118554624 unmapped: 29138944 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118554624 unmapped: 29138944 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9fd6000/0x0/0x4ffc00000, data 0x11cd4de/0x1286000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118554624 unmapped: 29138944 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9fd6000/0x0/0x4ffc00000, data 0x11cd4de/0x1286000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118554624 unmapped: 29138944 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9fd6000/0x0/0x4ffc00000, data 0x11cd4de/0x1286000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118554624 unmapped: 29138944 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1207611 data_alloc: 218103808 data_used: 7229440
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9fd6000/0x0/0x4ffc00000, data 0x11cd4de/0x1286000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118554624 unmapped: 29138944 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118554624 unmapped: 29138944 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118562816 unmapped: 29130752 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118562816 unmapped: 29130752 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118562816 unmapped: 29130752 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 18.172168732s of 18.204217911s, submitted: 12
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1265083 data_alloc: 218103808 data_used: 7229440
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 120430592 unmapped: 27262976 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f97b1000/0x0/0x4ffc00000, data 0x19f24de/0x1aab000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119906304 unmapped: 27787264 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119906304 unmapped: 27787264 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119906304 unmapped: 27787264 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f97a1000/0x0/0x4ffc00000, data 0x1a024de/0x1abb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119906304 unmapped: 27787264 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1278635 data_alloc: 218103808 data_used: 8278016
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119906304 unmapped: 27787264 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f97a1000/0x0/0x4ffc00000, data 0x1a024de/0x1abb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119906304 unmapped: 27787264 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119906304 unmapped: 27787264 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f97a1000/0x0/0x4ffc00000, data 0x1a024de/0x1abb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119906304 unmapped: 27787264 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119906304 unmapped: 27787264 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1278635 data_alloc: 218103808 data_used: 8278016
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119914496 unmapped: 27779072 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119914496 unmapped: 27779072 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f97a1000/0x0/0x4ffc00000, data 0x1a024de/0x1abb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119914496 unmapped: 27779072 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119914496 unmapped: 27779072 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119922688 unmapped: 27770880 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6a18c00 session 0x55e0b7af2780
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 14.954920769s of 15.095984459s, submitted: 60
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b4793800 session 0x55e0b7af2960
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f97a1000/0x0/0x4ffc00000, data 0x1a024de/0x1abb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1278503 data_alloc: 218103808 data_used: 8278016
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119930880 unmapped: 27762688 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6e71800 session 0x55e0b793f860
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117710848 unmapped: 29982720 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117710848 unmapped: 29982720 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117710848 unmapped: 29982720 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117710848 unmapped: 29982720 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117710848 unmapped: 29982720 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117710848 unmapped: 29982720 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117710848 unmapped: 29982720 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117710848 unmapped: 29982720 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117710848 unmapped: 29982720 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117710848 unmapped: 29982720 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117710848 unmapped: 29982720 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117710848 unmapped: 29982720 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117710848 unmapped: 29982720 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117710848 unmapped: 29982720 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117710848 unmapped: 29982720 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117710848 unmapped: 29982720 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117710848 unmapped: 29982720 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117710848 unmapped: 29982720 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117710848 unmapped: 29982720 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117710848 unmapped: 29982720 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117710848 unmapped: 29982720 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117710848 unmapped: 29982720 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117719040 unmapped: 29974528 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117719040 unmapped: 29974528 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117719040 unmapped: 29974528 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117719040 unmapped: 29974528 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117719040 unmapped: 29974528 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117719040 unmapped: 29974528 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117727232 unmapped: 29966336 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117727232 unmapped: 29966336 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117727232 unmapped: 29966336 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117727232 unmapped: 29966336 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117727232 unmapped: 29966336 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117727232 unmapped: 29966336 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117727232 unmapped: 29966336 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117727232 unmapped: 29966336 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117735424 unmapped: 29958144 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117735424 unmapped: 29958144 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117735424 unmapped: 29958144 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117735424 unmapped: 29958144 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117735424 unmapped: 29958144 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117735424 unmapped: 29958144 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117735424 unmapped: 29958144 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117743616 unmapped: 29949952 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117743616 unmapped: 29949952 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117743616 unmapped: 29949952 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117743616 unmapped: 29949952 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117743616 unmapped: 29949952 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117743616 unmapped: 29949952 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117743616 unmapped: 29949952 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117743616 unmapped: 29949952 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117743616 unmapped: 29949952 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117751808 unmapped: 29941760 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117751808 unmapped: 29941760 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117751808 unmapped: 29941760 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117751808 unmapped: 29941760 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117751808 unmapped: 29941760 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117751808 unmapped: 29941760 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117751808 unmapped: 29941760 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117751808 unmapped: 29941760 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117751808 unmapped: 29941760 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117751808 unmapped: 29941760 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117751808 unmapped: 29941760 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117751808 unmapped: 29941760 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117751808 unmapped: 29941760 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117760000 unmapped: 29933568 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117760000 unmapped: 29933568 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117760000 unmapped: 29933568 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117760000 unmapped: 29933568 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117760000 unmapped: 29933568 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117760000 unmapped: 29933568 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117768192 unmapped: 29925376 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117768192 unmapped: 29925376 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117768192 unmapped: 29925376 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117768192 unmapped: 29925376 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117768192 unmapped: 29925376 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117776384 unmapped: 29917184 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117776384 unmapped: 29917184 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117776384 unmapped: 29917184 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117776384 unmapped: 29917184 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117776384 unmapped: 29917184 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117776384 unmapped: 29917184 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117776384 unmapped: 29917184 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: do_command 'config diff' '{prefix=config diff}'
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117776384 unmapped: 29917184 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: do_command 'config show' '{prefix=config show}'
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: do_command 'counter dump' '{prefix=counter dump}'
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: do_command 'counter schema' '{prefix=counter schema}'
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117399552 unmapped: 30294016 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117465088 unmapped: 30228480 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:28:04 np0005540826 ceph-osd[77525]: do_command 'log dump' '{prefix=log dump}'
Dec  1 05:28:04 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Dec  1 05:28:04 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3568902322' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Dec  1 05:28:05 np0005540826 podman[246085]: 2025-12-01 10:28:05.044917322 +0000 UTC m=+0.125718272 container health_status 6b06bbb7dde72e1297fe084ecc5611549b12415c8a2a7d771b9728c6924dbf55 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec  1 05:28:05 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:28:05 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:28:05 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:28:05.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:28:05 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0)
Dec  1 05:28:05 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/274395677' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Dec  1 05:28:05 np0005540826 nova_compute[229148]: 2025-12-01 10:28:05.269 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:28:05 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0)
Dec  1 05:28:05 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2453994286' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Dec  1 05:28:05 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:28:05 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:28:05 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:28:05.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:28:06 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon stat"} v 0)
Dec  1 05:28:06 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2172107929' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Dec  1 05:28:06 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:28:07 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "node ls"} v 0)
Dec  1 05:28:07 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2172100142' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Dec  1 05:28:07 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec  1 05:28:07 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2608203925' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec  1 05:28:07 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec  1 05:28:07 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2608203925' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec  1 05:28:07 np0005540826 nova_compute[229148]: 2025-12-01 10:28:07.111 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:28:07 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:28:07 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:28:07 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:28:07.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:28:07 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush class ls"} v 0)
Dec  1 05:28:07 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/794302548' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Dec  1 05:28:07 np0005540826 podman[246549]: 2025-12-01 10:28:07.710042512 +0000 UTC m=+0.063179202 container health_status 2f6a34a2eb1139886d5df897b4264e2aa17883bd121a8757ac91426f6d44356f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS)
Dec  1 05:28:07 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush dump"} v 0)
Dec  1 05:28:07 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2770499777' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Dec  1 05:28:07 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:28:07 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:28:07 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:28:07.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:28:08 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} v 0)
Dec  1 05:28:08 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3798853096' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Dec  1 05:28:08 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush rule ls"} v 0)
Dec  1 05:28:08 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/770018958' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Dec  1 05:28:08 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr dump", "format": "json-pretty"} v 0)
Dec  1 05:28:08 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2378032138' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Dec  1 05:28:08 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush show-tunables"} v 0)
Dec  1 05:28:08 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/393227238' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Dec  1 05:28:08 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush tree", "show_shadow": true} v 0)
Dec  1 05:28:08 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1371982183' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Dec  1 05:28:09 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr metadata", "format": "json-pretty"} v 0)
Dec  1 05:28:09 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/312117713' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Dec  1 05:28:09 np0005540826 nova_compute[229148]: 2025-12-01 10:28:09.109 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:28:09 np0005540826 nova_compute[229148]: 2025-12-01 10:28:09.129 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:28:09 np0005540826 nova_compute[229148]: 2025-12-01 10:28:09.130 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:28:09 np0005540826 nova_compute[229148]: 2025-12-01 10:28:09.130 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:28:09 np0005540826 nova_compute[229148]: 2025-12-01 10:28:09.130 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  1 05:28:09 np0005540826 nova_compute[229148]: 2025-12-01 10:28:09.130 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:28:09 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:28:09 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:28:09 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:28:09.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:28:09 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd erasure-code-profile ls"} v 0)
Dec  1 05:28:09 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3388144832' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Dec  1 05:28:09 np0005540826 systemd[1]: Starting Hostname Service...
Dec  1 05:28:09 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr module ls", "format": "json-pretty"} v 0)
Dec  1 05:28:09 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4268465877' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Dec  1 05:28:09 np0005540826 systemd[1]: Started Hostname Service.
Dec  1 05:28:09 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:28:09 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/200579190' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:28:09 np0005540826 nova_compute[229148]: 2025-12-01 10:28:09.582 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.452s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:28:09 np0005540826 systemd[1]: virtsecretd.service: Deactivated successfully.
Dec  1 05:28:09 np0005540826 nova_compute[229148]: 2025-12-01 10:28:09.739 229152 WARNING nova.virt.libvirt.driver [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  1 05:28:09 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd metadata"} v 0)
Dec  1 05:28:09 np0005540826 nova_compute[229148]: 2025-12-01 10:28:09.740 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4655MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  1 05:28:09 np0005540826 nova_compute[229148]: 2025-12-01 10:28:09.741 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:28:09 np0005540826 nova_compute[229148]: 2025-12-01 10:28:09.741 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:28:09 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2269291642' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Dec  1 05:28:09 np0005540826 nova_compute[229148]: 2025-12-01 10:28:09.821 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  1 05:28:09 np0005540826 nova_compute[229148]: 2025-12-01 10:28:09.822 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  1 05:28:09 np0005540826 nova_compute[229148]: 2025-12-01 10:28:09.837 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:28:09 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:28:09 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:28:09 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:28:09.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:28:09 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr services", "format": "json-pretty"} v 0)
Dec  1 05:28:09 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/806825132' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Dec  1 05:28:10 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd utilization"} v 0)
Dec  1 05:28:10 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2625422848' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Dec  1 05:28:10 np0005540826 nova_compute[229148]: 2025-12-01 10:28:10.272 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:28:10 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:28:10 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2311727613' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:28:10 np0005540826 nova_compute[229148]: 2025-12-01 10:28:10.299 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.462s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:28:10 np0005540826 nova_compute[229148]: 2025-12-01 10:28:10.304 229152 DEBUG nova.compute.provider_tree [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Inventory has not changed in ProviderTree for provider: 19014d04-db84-4f3d-831b-084720e9168c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  1 05:28:10 np0005540826 nova_compute[229148]: 2025-12-01 10:28:10.327 229152 DEBUG nova.scheduler.client.report [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Inventory has not changed for provider 19014d04-db84-4f3d-831b-084720e9168c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  1 05:28:10 np0005540826 nova_compute[229148]: 2025-12-01 10:28:10.328 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  1 05:28:10 np0005540826 nova_compute[229148]: 2025-12-01 10:28:10.328 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.587s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:28:10 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr versions", "format": "json-pretty"} v 0)
Dec  1 05:28:10 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/301400875' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Dec  1 05:28:11 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:28:11 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:28:11 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:28:11.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:28:11 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:28:11 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:28:11 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:28:11.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:28:11 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:28:11 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "quorum_status"} v 0)
Dec  1 05:28:11 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2746776080' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Dec  1 05:28:12 np0005540826 nova_compute[229148]: 2025-12-01 10:28:12.113 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:28:12 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "versions"} v 0)
Dec  1 05:28:12 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3458671968' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Dec  1 05:28:12 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail", "format": "json-pretty"} v 0)
Dec  1 05:28:12 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1132174747' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Dec  1 05:28:13 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:28:13 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:28:13 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:28:13.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:28:13 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd tree", "format": "json-pretty"} v 0)
Dec  1 05:28:13 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3672121130' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Dec  1 05:28:13 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec  1 05:28:13 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec  1 05:28:13 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:28:13 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:28:13 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:28:13.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:28:14 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config dump"} v 0)
Dec  1 05:28:14 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3320534148' entity='client.admin' cmd=[{"prefix": "config dump"}]: dispatch
Dec  1 05:28:14 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec  1 05:28:14 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec  1 05:28:14 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "detail": "detail"} v 0)
Dec  1 05:28:14 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2455707800' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail"}]: dispatch
Dec  1 05:28:15 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:28:15 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:28:15 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:28:15.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:28:15 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec  1 05:28:15 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec  1 05:28:15 np0005540826 nova_compute[229148]: 2025-12-01 10:28:15.272 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:28:15 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df"} v 0)
Dec  1 05:28:15 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2646154821' entity='client.admin' cmd=[{"prefix": "df"}]: dispatch
Dec  1 05:28:15 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:28:15 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:28:15 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:28:15.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:28:15 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "fs dump"} v 0)
Dec  1 05:28:15 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1300857708' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
Dec  1 05:28:16 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "fs ls"} v 0)
Dec  1 05:28:16 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3135176345' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch
Dec  1 05:28:16 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:28:17 np0005540826 nova_compute[229148]: 2025-12-01 10:28:17.117 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:28:17 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:28:17 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:28:17 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:28:17.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:28:17 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mds stat"} v 0)
Dec  1 05:28:17 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3841472996' entity='client.admin' cmd=[{"prefix": "mds stat"}]: dispatch
Dec  1 05:28:17 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump"} v 0)
Dec  1 05:28:17 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2896343475' entity='client.admin' cmd=[{"prefix": "mon dump"}]: dispatch
Dec  1 05:28:17 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:28:17 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:28:17 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:28:17.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:28:18 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd blocklist ls"} v 0)
Dec  1 05:28:18 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3202030290' entity='client.admin' cmd=[{"prefix": "osd blocklist ls"}]: dispatch
Dec  1 05:28:19 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:28:19 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:28:19 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:28:19.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:28:19 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd dump"} v 0)
Dec  1 05:28:19 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2630146053' entity='client.admin' cmd=[{"prefix": "osd dump"}]: dispatch
Dec  1 05:28:19 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:28:19 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:28:19 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:28:19.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:28:20 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd numa-status"} v 0)
Dec  1 05:28:20 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/494296196' entity='client.admin' cmd=[{"prefix": "osd numa-status"}]: dispatch
Dec  1 05:28:20 np0005540826 nova_compute[229148]: 2025-12-01 10:28:20.274 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:28:20 np0005540826 ovs-appctl[248798]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Dec  1 05:28:20 np0005540826 ovs-appctl[248806]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Dec  1 05:28:20 np0005540826 ovs-appctl[248814]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Dec  1 05:28:21 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:28:21 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:28:21 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:28:21.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:28:21 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd pool ls", "detail": "detail"} v 0)
Dec  1 05:28:21 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/241861749' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail"}]: dispatch
Dec  1 05:28:21 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd stat"} v 0)
Dec  1 05:28:21 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2911707120' entity='client.admin' cmd=[{"prefix": "osd stat"}]: dispatch
Dec  1 05:28:21 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:28:21 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:28:21 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:28:21 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:28:21.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:28:22 np0005540826 nova_compute[229148]: 2025-12-01 10:28:22.119 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:28:22 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "status"} v 0)
Dec  1 05:28:22 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2400025989' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Dec  1 05:28:23 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:28:23 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:28:23 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:28:23.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:28:23 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "time-sync-status"} v 0)
Dec  1 05:28:23 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/863008765' entity='client.admin' cmd=[{"prefix": "time-sync-status"}]: dispatch
Dec  1 05:28:23 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config dump", "format": "json-pretty"} v 0)
Dec  1 05:28:23 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4163488074' entity='client.admin' cmd=[{"prefix": "config dump", "format": "json-pretty"}]: dispatch
Dec  1 05:28:23 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:28:23 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:28:23 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:28:23.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:28:24 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "detail": "detail", "format": "json-pretty"} v 0)
Dec  1 05:28:24 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/856131792' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail", "format": "json-pretty"}]: dispatch
Dec  1 05:28:24 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json-pretty"} v 0)
Dec  1 05:28:24 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3750332313' entity='client.admin' cmd=[{"prefix": "df", "format": "json-pretty"}]: dispatch
Dec  1 05:28:25 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:28:25 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:28:25 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:28:25.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:28:25 np0005540826 nova_compute[229148]: 2025-12-01 10:28:25.276 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:28:25 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "fs dump", "format": "json-pretty"} v 0)
Dec  1 05:28:25 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1523324552' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json-pretty"}]: dispatch
Dec  1 05:28:25 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "fs ls", "format": "json-pretty"} v 0)
Dec  1 05:28:25 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2198529714' entity='client.admin' cmd=[{"prefix": "fs ls", "format": "json-pretty"}]: dispatch
Dec  1 05:28:25 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:28:25 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:28:25 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:28:25.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:28:25 np0005540826 podman[250380]: 2025-12-01 10:28:25.990085232 +0000 UTC m=+0.071731515 container health_status 9ce59c2758d021c154e97f8a3178b73d8f43045c59b9ba6fd93369a760a49136 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=multipathd)
Dec  1 05:28:26 np0005540826 ceph-osd[77525]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec  1 05:28:26 np0005540826 ceph-osd[77525]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 2400.0 total, 600.0 interval#012Cumulative writes: 11K writes, 42K keys, 11K commit groups, 1.0 writes per commit group, ingest: 0.03 GB, 0.01 MB/s#012Cumulative WAL: 11K writes, 3071 syncs, 3.70 writes per sync, written: 0.03 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 2114 writes, 7094 keys, 2114 commit groups, 1.0 writes per commit group, ingest: 8.13 MB, 0.01 MB/s#012Interval WAL: 2114 writes, 900 syncs, 2.35 writes per sync, written: 0.01 GB, 0.01 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec  1 05:28:26 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mds stat", "format": "json-pretty"} v 0)
Dec  1 05:28:26 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/446968811' entity='client.admin' cmd=[{"prefix": "mds stat", "format": "json-pretty"}]: dispatch
Dec  1 05:28:26 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:28:27 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json-pretty"} v 0)
Dec  1 05:28:27 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2609922541' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json-pretty"}]: dispatch
Dec  1 05:28:27 np0005540826 nova_compute[229148]: 2025-12-01 10:28:27.122 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:28:27 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:28:27 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:28:27 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:28:27.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:28:27 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:28:27 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:28:27 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:28:27.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:28:28 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd blocklist ls", "format": "json-pretty"} v 0)
Dec  1 05:28:28 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/805398601' entity='client.admin' cmd=[{"prefix": "osd blocklist ls", "format": "json-pretty"}]: dispatch
Dec  1 05:28:29 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:28:29 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:28:29 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:28:29.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:28:29 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd dump", "format": "json-pretty"} v 0)
Dec  1 05:28:29 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2946772512' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json-pretty"}]: dispatch
Dec  1 05:28:29 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd numa-status", "format": "json-pretty"} v 0)
Dec  1 05:28:29 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3869686804' entity='client.admin' cmd=[{"prefix": "osd numa-status", "format": "json-pretty"}]: dispatch
Dec  1 05:28:29 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:28:29 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:28:29 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:28:29.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:28:30 np0005540826 nova_compute[229148]: 2025-12-01 10:28:30.317 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:28:30 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:28:30 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:28:30 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:28:30 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:28:30 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 05:28:30 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:28:30 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:28:30 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 05:28:30 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd pool ls", "detail": "detail", "format": "json-pretty"} v 0)
Dec  1 05:28:30 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3026284103' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail", "format": "json-pretty"}]: dispatch
Dec  1 05:28:31 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd stat", "format": "json-pretty"} v 0)
Dec  1 05:28:31 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3197457651' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json-pretty"}]: dispatch
Dec  1 05:28:31 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:28:31 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:28:31 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:28:31.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:28:31 np0005540826 virtqemud[228647]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Dec  1 05:28:31 np0005540826 systemd[1]: Starting Time & Date Service...
Dec  1 05:28:31 np0005540826 systemd[1]: Started Time & Date Service.
Dec  1 05:28:31 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:28:31 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:28:31 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:28:31 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:28:31.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:28:32 np0005540826 nova_compute[229148]: 2025-12-01 10:28:32.124 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:28:32 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "status", "format": "json-pretty"} v 0)
Dec  1 05:28:32 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3615877885' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch
Dec  1 05:28:33 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "time-sync-status", "format": "json-pretty"} v 0)
Dec  1 05:28:33 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4148240341' entity='client.admin' cmd=[{"prefix": "time-sync-status", "format": "json-pretty"}]: dispatch
Dec  1 05:28:33 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:28:33 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:28:33 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:28:33.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:28:33 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:28:33 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:28:33 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:28:33.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:28:35 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:28:35 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:28:35 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:28:35.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:28:35 np0005540826 nova_compute[229148]: 2025-12-01 10:28:35.321 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:28:35 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:28:35 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:28:35 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:28:35.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:28:36 np0005540826 podman[251466]: 2025-12-01 10:28:36.018845529 +0000 UTC m=+0.099680969 container health_status 6b06bbb7dde72e1297fe084ecc5611549b12415c8a2a7d771b9728c6924dbf55 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec  1 05:28:36 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:28:37 np0005540826 nova_compute[229148]: 2025-12-01 10:28:37.127 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:28:37 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:28:37 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:28:37 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:28:37.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:28:37 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:28:37 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:28:37 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:28:37 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:28:37 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:28:37.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:28:37 np0005540826 podman[251518]: 2025-12-01 10:28:37.956985308 +0000 UTC m=+0.045809290 container health_status 2f6a34a2eb1139886d5df897b4264e2aa17883bd121a8757ac91426f6d44356f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec  1 05:28:39 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:28:39 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:28:39 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:28:39.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:28:39 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:28:39 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:28:39 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:28:39.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:28:40 np0005540826 nova_compute[229148]: 2025-12-01 10:28:40.323 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:28:41 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:28:41 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:28:41 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:28:41.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:28:41 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:28:41 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:28:41 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:28:41 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:28:41.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:28:42 np0005540826 nova_compute[229148]: 2025-12-01 10:28:42.131 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:28:43 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:28:43 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:28:43 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:28:43.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:28:43 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:28:43 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:28:43 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:28:43.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:28:45 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:28:45 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:28:45 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:28:45.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:28:45 np0005540826 nova_compute[229148]: 2025-12-01 10:28:45.324 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:28:45 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:28:45 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:28:45 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:28:45.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:28:46 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:28:47 np0005540826 nova_compute[229148]: 2025-12-01 10:28:47.133 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:28:47 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:28:47 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:28:47 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:28:47.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:28:47 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:28:47 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:28:47 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:28:47.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:28:49 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:28:49 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:28:49 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:28:49.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:28:49 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:28:49 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:28:49 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:28:49.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:28:50 np0005540826 nova_compute[229148]: 2025-12-01 10:28:50.361 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:28:51 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:28:51 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:28:51 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:28:51.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:28:51 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:28:51 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:28:51 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:28:51 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:28:51.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:28:52 np0005540826 nova_compute[229148]: 2025-12-01 10:28:52.136 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:28:53 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:28:53 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:28:53 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:28:53.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:28:53 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:28:53 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:28:53 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:28:53.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:28:55 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:28:55 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:28:55 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:28:55.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:28:55 np0005540826 nova_compute[229148]: 2025-12-01 10:28:55.363 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:28:55 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:28:55 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:28:55 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:28:55.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:28:56 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:28:56 np0005540826 podman[251572]: 2025-12-01 10:28:56.993935945 +0000 UTC m=+0.071722495 container health_status 9ce59c2758d021c154e97f8a3178b73d8f43045c59b9ba6fd93369a760a49136 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec  1 05:28:57 np0005540826 nova_compute[229148]: 2025-12-01 10:28:57.146 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:28:57 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:28:57 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:28:57 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:28:57.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:28:57 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:28:57 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:28:57 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:28:57.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:28:59 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:28:59 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:28:59 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:28:59.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:28:59 np0005540826 nova_compute[229148]: 2025-12-01 10:28:59.329 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:28:59 np0005540826 nova_compute[229148]: 2025-12-01 10:28:59.330 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  1 05:28:59 np0005540826 nova_compute[229148]: 2025-12-01 10:28:59.330 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  1 05:28:59 np0005540826 nova_compute[229148]: 2025-12-01 10:28:59.343 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  1 05:28:59 np0005540826 nova_compute[229148]: 2025-12-01 10:28:59.344 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:28:59 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:28:59 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:28:59 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:28:59.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:29:00 np0005540826 nova_compute[229148]: 2025-12-01 10:29:00.399 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:29:01 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:29:01 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:29:01 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:29:01.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:29:01 np0005540826 systemd[1]: systemd-timedated.service: Deactivated successfully.
Dec  1 05:29:01 np0005540826 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec  1 05:29:01 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:29:01 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:29:01 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:29:01 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:29:01.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:29:02 np0005540826 nova_compute[229148]: 2025-12-01 10:29:02.110 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:29:02 np0005540826 nova_compute[229148]: 2025-12-01 10:29:02.110 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:29:02 np0005540826 nova_compute[229148]: 2025-12-01 10:29:02.110 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  1 05:29:02 np0005540826 nova_compute[229148]: 2025-12-01 10:29:02.149 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:29:03 np0005540826 nova_compute[229148]: 2025-12-01 10:29:03.109 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:29:03 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:29:03 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:29:03 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:29:03.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:29:03 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:29:03 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:29:03 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:29:03.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:29:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:29:04.562 141685 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:29:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:29:04.563 141685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:29:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:29:04.563 141685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:29:05 np0005540826 nova_compute[229148]: 2025-12-01 10:29:05.104 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:29:05 np0005540826 nova_compute[229148]: 2025-12-01 10:29:05.109 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:29:05 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:29:05 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:29:05 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:29:05.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:29:05 np0005540826 nova_compute[229148]: 2025-12-01 10:29:05.399 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:29:05 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:29:05 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:29:05 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:29:05.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:29:06 np0005540826 nova_compute[229148]: 2025-12-01 10:29:06.110 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:29:06 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:29:07 np0005540826 podman[251601]: 2025-12-01 10:29:07.053459807 +0000 UTC m=+0.129062630 container health_status 6b06bbb7dde72e1297fe084ecc5611549b12415c8a2a7d771b9728c6924dbf55 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Dec  1 05:29:07 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec  1 05:29:07 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2726321627' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec  1 05:29:07 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec  1 05:29:07 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2726321627' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec  1 05:29:07 np0005540826 nova_compute[229148]: 2025-12-01 10:29:07.149 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:29:07 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:29:07 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:29:07 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:29:07.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:29:07 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:29:07 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:29:07 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:29:07.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:29:08 np0005540826 podman[251627]: 2025-12-01 10:29:08.389760901 +0000 UTC m=+0.043058871 container health_status 2f6a34a2eb1139886d5df897b4264e2aa17883bd121a8757ac91426f6d44356f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, 
tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec  1 05:29:09 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:29:09 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:29:09 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:29:09.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:29:09 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:29:09 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:29:09 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:29:09.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:29:10 np0005540826 nova_compute[229148]: 2025-12-01 10:29:10.436 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:29:11 np0005540826 nova_compute[229148]: 2025-12-01 10:29:11.109 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:29:11 np0005540826 nova_compute[229148]: 2025-12-01 10:29:11.131 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:29:11 np0005540826 nova_compute[229148]: 2025-12-01 10:29:11.132 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:29:11 np0005540826 nova_compute[229148]: 2025-12-01 10:29:11.132 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:29:11 np0005540826 nova_compute[229148]: 2025-12-01 10:29:11.132 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  1 05:29:11 np0005540826 nova_compute[229148]: 2025-12-01 10:29:11.132 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:29:11 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:29:11 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:29:11 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:29:11.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:29:11 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:29:11 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/802762943' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:29:11 np0005540826 nova_compute[229148]: 2025-12-01 10:29:11.573 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.440s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:29:11 np0005540826 nova_compute[229148]: 2025-12-01 10:29:11.703 229152 WARNING nova.virt.libvirt.driver [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  1 05:29:11 np0005540826 nova_compute[229148]: 2025-12-01 10:29:11.704 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4735MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  1 05:29:11 np0005540826 nova_compute[229148]: 2025-12-01 10:29:11.704 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:29:11 np0005540826 nova_compute[229148]: 2025-12-01 10:29:11.704 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:29:11 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:29:11 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:29:11 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:29:11 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:29:11.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:29:12 np0005540826 nova_compute[229148]: 2025-12-01 10:29:12.151 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:29:12 np0005540826 nova_compute[229148]: 2025-12-01 10:29:12.187 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  1 05:29:12 np0005540826 nova_compute[229148]: 2025-12-01 10:29:12.187 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  1 05:29:12 np0005540826 nova_compute[229148]: 2025-12-01 10:29:12.209 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:29:12 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:29:12 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/756928821' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:29:12 np0005540826 nova_compute[229148]: 2025-12-01 10:29:12.646 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.436s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:29:12 np0005540826 nova_compute[229148]: 2025-12-01 10:29:12.653 229152 DEBUG nova.compute.provider_tree [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Inventory has not changed in ProviderTree for provider: 19014d04-db84-4f3d-831b-084720e9168c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  1 05:29:12 np0005540826 nova_compute[229148]: 2025-12-01 10:29:12.753 229152 DEBUG nova.scheduler.client.report [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Inventory has not changed for provider 19014d04-db84-4f3d-831b-084720e9168c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  1 05:29:12 np0005540826 nova_compute[229148]: 2025-12-01 10:29:12.755 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  1 05:29:12 np0005540826 nova_compute[229148]: 2025-12-01 10:29:12.755 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.051s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:29:13 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:29:13 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:29:13 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:29:13.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:29:13 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:29:13 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:29:13 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:29:13.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:29:15 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:29:15 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:29:15 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:29:15.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:29:15 np0005540826 nova_compute[229148]: 2025-12-01 10:29:15.439 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:29:15 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:29:15 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:29:15 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:29:15.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:29:16 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:29:17 np0005540826 nova_compute[229148]: 2025-12-01 10:29:17.153 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:29:17 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:29:17 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:29:17 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:29:17.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:29:17 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:29:17 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:29:17 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:29:17.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:29:19 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:29:19 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:29:19 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:29:19.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:29:19 np0005540826 ceph-mon[80026]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec  1 05:29:19 np0005540826 ceph-mon[80026]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 2400.0 total, 600.0 interval#012Cumulative writes: 6691 writes, 35K keys, 6691 commit groups, 1.0 writes per commit group, ingest: 0.09 GB, 0.04 MB/s#012Cumulative WAL: 6691 writes, 6691 syncs, 1.00 writes per sync, written: 0.09 GB, 0.04 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1556 writes, 8128 keys, 1556 commit groups, 1.0 writes per commit group, ingest: 18.27 MB, 0.03 MB/s#012Interval WAL: 1556 writes, 1556 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0    105.9      0.47              0.15        18    0.026       0      0       0.0       0.0#012  L6      1/0   13.94 MB   0.0      0.3     0.0      0.2       0.2      0.0       0.0   4.5    180.6    155.4      1.44              0.55        17    0.085     94K   9342       0.0       0.0#012 Sum      1/0   13.94 MB   0.0      0.3     0.0      0.2       0.3      0.1       0.0   5.5    135.9    143.2      1.91              0.70        35    0.055     94K   9342       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   5.9    108.0    110.1      0.62              0.17         8    0.077     26K   2583       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) 
Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.3     0.0      0.2       0.2      0.0       0.0   0.0    180.6    155.4      1.44              0.55        17    0.085     94K   9342       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0    106.3      0.47              0.15        17    0.028       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 2400.0 total, 600.0 interval#012Flush(GB): cumulative 0.049, interval 0.011#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.27 GB write, 0.11 MB/s write, 0.25 GB read, 0.11 MB/s read, 1.9 seconds#012Interval compaction: 0.07 GB write, 0.11 MB/s write, 0.07 GB read, 0.11 MB/s read, 0.6 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55d4317d9350#2 capacity: 304.00 MB usage: 22.57 MB table_size: 0 occupancy: 18446744073709551615 collections: 5 last_copies: 0 last_secs: 0.000153 secs_since: 0#012Block cache entry stats(count,size,portion): 
DataBlock(1388,21.84 MB,7.18418%) FilterBlock(35,275.17 KB,0.0883956%) IndexBlock(35,476.58 KB,0.153095%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Dec  1 05:29:19 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:29:19 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:29:19 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:29:19.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:29:20 np0005540826 nova_compute[229148]: 2025-12-01 10:29:20.443 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:29:21 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:29:21 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:29:21 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:29:21.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:29:21 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:29:21 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:29:21 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:29:21 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:29:21.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:29:22 np0005540826 nova_compute[229148]: 2025-12-01 10:29:22.157 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:29:23 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:29:23 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:29:23 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:29:23.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:29:23 np0005540826 systemd[1]: session-55.scope: Deactivated successfully.
Dec  1 05:29:23 np0005540826 systemd[1]: session-55.scope: Consumed 2min 50.438s CPU time, 815.2M memory peak, read 348.6M from disk, written 84.5M to disk.
Dec  1 05:29:23 np0005540826 systemd-logind[787]: Session 55 logged out. Waiting for processes to exit.
Dec  1 05:29:23 np0005540826 systemd-logind[787]: Removed session 55.
Dec  1 05:29:23 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:29:23 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:29:23 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:29:23.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:29:23 np0005540826 systemd-logind[787]: New session 56 of user zuul.
Dec  1 05:29:23 np0005540826 systemd[1]: Started Session 56 of User zuul.
Dec  1 05:29:24 np0005540826 systemd[1]: session-56.scope: Deactivated successfully.
Dec  1 05:29:24 np0005540826 systemd-logind[787]: Session 56 logged out. Waiting for processes to exit.
Dec  1 05:29:24 np0005540826 systemd-logind[787]: Removed session 56.
Dec  1 05:29:25 np0005540826 systemd-logind[787]: New session 57 of user zuul.
Dec  1 05:29:25 np0005540826 systemd[1]: Started Session 57 of User zuul.
Dec  1 05:29:25 np0005540826 systemd[1]: session-57.scope: Deactivated successfully.
Dec  1 05:29:25 np0005540826 systemd-logind[787]: Session 57 logged out. Waiting for processes to exit.
Dec  1 05:29:25 np0005540826 systemd-logind[787]: Removed session 57.
Dec  1 05:29:25 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:29:25 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:29:25 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:29:25.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:29:25 np0005540826 nova_compute[229148]: 2025-12-01 10:29:25.458 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:29:25 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:29:25 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:29:25 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:29:25.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:29:26 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:29:27 np0005540826 nova_compute[229148]: 2025-12-01 10:29:27.159 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:29:27 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:29:27 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:29:27 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:29:27.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:29:27 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:29:27 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:29:27 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:29:27.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:29:27 np0005540826 podman[251784]: 2025-12-01 10:29:27.976992723 +0000 UTC m=+0.059792567 container health_status 9ce59c2758d021c154e97f8a3178b73d8f43045c59b9ba6fd93369a760a49136 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec  1 05:29:29 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:29:29 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:29:29 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:29:29.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:29:29 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:29:29 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:29:29 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:29:29.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:29:30 np0005540826 nova_compute[229148]: 2025-12-01 10:29:30.553 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:29:31 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:29:31 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:29:31 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:29:31.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:29:31 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:29:31 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:29:31 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:29:31 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:29:31.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:29:32 np0005540826 nova_compute[229148]: 2025-12-01 10:29:32.161 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:29:33 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:29:33 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:29:33 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:29:33.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:29:33 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:29:33 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:29:33 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:29:33.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:29:35 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:29:35 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:29:35 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:29:35.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:29:35 np0005540826 nova_compute[229148]: 2025-12-01 10:29:35.555 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:29:35 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:29:35 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:29:35 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:29:35.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:29:36 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:29:37 np0005540826 nova_compute[229148]: 2025-12-01 10:29:37.164 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:29:37 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:29:37 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:29:37 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:29:37.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:29:37 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:29:37 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:29:37 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:29:37.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:29:38 np0005540826 podman[251917]: 2025-12-01 10:29:38.015626016 +0000 UTC m=+0.091131217 container health_status 6b06bbb7dde72e1297fe084ecc5611549b12415c8a2a7d771b9728c6924dbf55 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_controller)
Dec  1 05:29:38 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:29:38 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:29:38 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 05:29:38 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:29:38 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:29:38 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 05:29:38 np0005540826 podman[251946]: 2025-12-01 10:29:38.9629756 +0000 UTC m=+0.048425725 container health_status 2f6a34a2eb1139886d5df897b4264e2aa17883bd121a8757ac91426f6d44356f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Dec  1 05:29:39 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:29:39 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:29:39 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:29:39.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:29:39 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:29:39 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:29:39 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:29:39.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:29:40 np0005540826 nova_compute[229148]: 2025-12-01 10:29:40.608 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:29:41 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:29:41 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:29:41 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:29:41.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:29:41 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:29:41 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:29:41 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:29:41 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:29:41.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:29:42 np0005540826 nova_compute[229148]: 2025-12-01 10:29:42.167 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:29:43 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:29:43 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:29:43 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:29:43.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:29:43 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:29:43 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:29:43 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:29:43.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:29:45 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:29:45 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:29:45 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:29:45.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:29:45 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:29:45 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:29:45 np0005540826 nova_compute[229148]: 2025-12-01 10:29:45.609 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:29:45 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:29:45 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:29:45 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:29:45.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:29:46 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:29:47 np0005540826 nova_compute[229148]: 2025-12-01 10:29:47.171 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:29:47 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:29:47 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:29:47 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:29:47.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:29:47 np0005540826 ceph-mon[80026]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #64. Immutable memtables: 0.
Dec  1 05:29:47 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:29:47.449525) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  1 05:29:47 np0005540826 ceph-mon[80026]: rocksdb: [db/flush_job.cc:856] [default] [JOB 37] Flushing memtable with next log file: 64
Dec  1 05:29:47 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584987449559, "job": 37, "event": "flush_started", "num_memtables": 1, "num_entries": 2950, "num_deletes": 506, "total_data_size": 6727133, "memory_usage": 6831952, "flush_reason": "Manual Compaction"}
Dec  1 05:29:47 np0005540826 ceph-mon[80026]: rocksdb: [db/flush_job.cc:885] [default] [JOB 37] Level-0 flush table #65: started
Dec  1 05:29:47 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584987484010, "cf_name": "default", "job": 37, "event": "table_file_creation", "file_number": 65, "file_size": 4347942, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 33524, "largest_seqno": 36469, "table_properties": {"data_size": 4335363, "index_size": 7537, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3973, "raw_key_size": 33338, "raw_average_key_size": 21, "raw_value_size": 4306800, "raw_average_value_size": 2737, "num_data_blocks": 321, "num_entries": 1573, "num_filter_entries": 1573, "num_deletions": 506, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764584788, "oldest_key_time": 1764584788, "file_creation_time": 1764584987, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d6e1f97e-eb58-41c1-b758-cb672eabd75e", "db_session_id": "KSHNHU57VR1LHZ6EZUM4", "orig_file_number": 65, "seqno_to_time_mapping": "N/A"}}
Dec  1 05:29:47 np0005540826 ceph-mon[80026]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 37] Flush lasted 34568 microseconds, and 9330 cpu microseconds.
Dec  1 05:29:47 np0005540826 ceph-mon[80026]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 05:29:47 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:29:47.484094) [db/flush_job.cc:967] [default] [JOB 37] Level-0 flush table #65: 4347942 bytes OK
Dec  1 05:29:47 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:29:47.484114) [db/memtable_list.cc:519] [default] Level-0 commit table #65 started
Dec  1 05:29:47 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:29:47.485784) [db/memtable_list.cc:722] [default] Level-0 commit table #65: memtable #1 done
Dec  1 05:29:47 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:29:47.485797) EVENT_LOG_v1 {"time_micros": 1764584987485793, "job": 37, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  1 05:29:47 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:29:47.485813) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  1 05:29:47 np0005540826 ceph-mon[80026]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 37] Try to delete WAL files size 6712241, prev total WAL file size 6712241, number of live WAL files 2.
Dec  1 05:29:47 np0005540826 ceph-mon[80026]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000061.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:29:47 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:29:47.487743) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032373631' seq:72057594037927935, type:22 .. '7061786F730033303133' seq:0, type:0; will stop at (end)
Dec  1 05:29:47 np0005540826 ceph-mon[80026]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 38] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  1 05:29:47 np0005540826 ceph-mon[80026]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 37 Base level 0, inputs: [65(4246KB)], [63(13MB)]
Dec  1 05:29:47 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584987487809, "job": 38, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [65], "files_L6": [63], "score": -1, "input_data_size": 18970200, "oldest_snapshot_seqno": -1}
Dec  1 05:29:47 np0005540826 ceph-mon[80026]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 38] Generated table #66: 6815 keys, 16744768 bytes, temperature: kUnknown
Dec  1 05:29:47 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584987568682, "cf_name": "default", "job": 38, "event": "table_file_creation", "file_number": 66, "file_size": 16744768, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 16697405, "index_size": 29203, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 17093, "raw_key_size": 175934, "raw_average_key_size": 25, "raw_value_size": 16573135, "raw_average_value_size": 2431, "num_data_blocks": 1172, "num_entries": 6815, "num_filter_entries": 6815, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582559, "oldest_key_time": 0, "file_creation_time": 1764584987, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d6e1f97e-eb58-41c1-b758-cb672eabd75e", "db_session_id": "KSHNHU57VR1LHZ6EZUM4", "orig_file_number": 66, "seqno_to_time_mapping": "N/A"}}
Dec  1 05:29:47 np0005540826 ceph-mon[80026]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 05:29:47 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:29:47.568919) [db/compaction/compaction_job.cc:1663] [default] [JOB 38] Compacted 1@0 + 1@6 files to L6 => 16744768 bytes
Dec  1 05:29:47 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:29:47.570236) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 234.3 rd, 206.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(4.1, 13.9 +0.0 blob) out(16.0 +0.0 blob), read-write-amplify(8.2) write-amplify(3.9) OK, records in: 7846, records dropped: 1031 output_compression: NoCompression
Dec  1 05:29:47 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:29:47.570251) EVENT_LOG_v1 {"time_micros": 1764584987570244, "job": 38, "event": "compaction_finished", "compaction_time_micros": 80954, "compaction_time_cpu_micros": 33014, "output_level": 6, "num_output_files": 1, "total_output_size": 16744768, "num_input_records": 7846, "num_output_records": 6815, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  1 05:29:47 np0005540826 ceph-mon[80026]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000065.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:29:47 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584987571222, "job": 38, "event": "table_file_deletion", "file_number": 65}
Dec  1 05:29:47 np0005540826 ceph-mon[80026]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000063.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:29:47 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584987573754, "job": 38, "event": "table_file_deletion", "file_number": 63}
Dec  1 05:29:47 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:29:47.487578) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:29:47 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:29:47.573839) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:29:47 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:29:47.573846) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:29:47 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:29:47.573848) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:29:47 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:29:47.573849) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:29:47 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:29:47.573851) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:29:47 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:29:47 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:29:47 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:29:47.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:29:49 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:29:49 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:29:49 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:29:49.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:29:49 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:29:49 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:29:49 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:29:49.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:29:50 np0005540826 nova_compute[229148]: 2025-12-01 10:29:50.654 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:29:51 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:29:51 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:29:51 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:29:51.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:29:51 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:29:51 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:29:51 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:29:51 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:29:51.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:29:52 np0005540826 nova_compute[229148]: 2025-12-01 10:29:52.173 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:29:53 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:29:53 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:29:53 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:29:53.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:29:53 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:29:54 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:29:54 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:29:53.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:29:55 np0005540826 nova_compute[229148]: 2025-12-01 10:29:55.110 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:29:55 np0005540826 nova_compute[229148]: 2025-12-01 10:29:55.110 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Dec  1 05:29:55 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:29:55 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:29:55 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:29:55.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:29:55 np0005540826 nova_compute[229148]: 2025-12-01 10:29:55.655 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:29:56 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:29:56 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:29:56 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:29:56.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:29:56 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:29:57 np0005540826 nova_compute[229148]: 2025-12-01 10:29:57.175 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:29:57 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:29:57 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:29:57 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:29:57.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:29:58 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:29:58 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:29:58 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:29:58.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:29:58 np0005540826 podman[252027]: 2025-12-01 10:29:58.970374658 +0000 UTC m=+0.056291320 container health_status 9ce59c2758d021c154e97f8a3178b73d8f43045c59b9ba6fd93369a760a49136 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Dec  1 05:29:59 np0005540826 nova_compute[229148]: 2025-12-01 10:29:59.125 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:29:59 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:29:59 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:29:59 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:29:59.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:30:00 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:30:00 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:30:00 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:30:00.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:30:00 np0005540826 nova_compute[229148]: 2025-12-01 10:30:00.110 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:30:00 np0005540826 nova_compute[229148]: 2025-12-01 10:30:00.110 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  1 05:30:00 np0005540826 nova_compute[229148]: 2025-12-01 10:30:00.111 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  1 05:30:00 np0005540826 nova_compute[229148]: 2025-12-01 10:30:00.149 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  1 05:30:00 np0005540826 ceph-mon[80026]: Health detail: HEALTH_WARN 1 OSD(s) experiencing slow operations in BlueStore; 2 failed cephadm daemon(s)
Dec  1 05:30:00 np0005540826 ceph-mon[80026]: [WRN] BLUESTORE_SLOW_OP_ALERT: 1 OSD(s) experiencing slow operations in BlueStore
Dec  1 05:30:00 np0005540826 ceph-mon[80026]:     osd.2 observed slow operation indications in BlueStore
Dec  1 05:30:00 np0005540826 ceph-mon[80026]: [WRN] CEPHADM_FAILED_DAEMON: 2 failed cephadm daemon(s)
Dec  1 05:30:00 np0005540826 ceph-mon[80026]:    daemon nfs.cephfs.0.0.compute-1.osfnzc on compute-1 is in error state
Dec  1 05:30:00 np0005540826 ceph-mon[80026]:    daemon nfs.cephfs.1.0.compute-2.ymqwfj on compute-2 is in error state
Dec  1 05:30:00 np0005540826 nova_compute[229148]: 2025-12-01 10:30:00.658 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:30:00 np0005540826 ceph-mon[80026]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #67. Immutable memtables: 0.
Dec  1 05:30:00 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:30:00.735134) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  1 05:30:00 np0005540826 ceph-mon[80026]: rocksdb: [db/flush_job.cc:856] [default] [JOB 39] Flushing memtable with next log file: 67
Dec  1 05:30:00 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764585000735191, "job": 39, "event": "flush_started", "num_memtables": 1, "num_entries": 378, "num_deletes": 251, "total_data_size": 404124, "memory_usage": 412400, "flush_reason": "Manual Compaction"}
Dec  1 05:30:00 np0005540826 ceph-mon[80026]: rocksdb: [db/flush_job.cc:885] [default] [JOB 39] Level-0 flush table #68: started
Dec  1 05:30:00 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764585000738398, "cf_name": "default", "job": 39, "event": "table_file_creation", "file_number": 68, "file_size": 247709, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 36474, "largest_seqno": 36847, "table_properties": {"data_size": 245482, "index_size": 391, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 773, "raw_key_size": 6170, "raw_average_key_size": 20, "raw_value_size": 240987, "raw_average_value_size": 797, "num_data_blocks": 17, "num_entries": 302, "num_filter_entries": 302, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764584988, "oldest_key_time": 1764584988, "file_creation_time": 1764585000, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d6e1f97e-eb58-41c1-b758-cb672eabd75e", "db_session_id": "KSHNHU57VR1LHZ6EZUM4", "orig_file_number": 68, "seqno_to_time_mapping": "N/A"}}
Dec  1 05:30:00 np0005540826 ceph-mon[80026]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 39] Flush lasted 3292 microseconds, and 1466 cpu microseconds.
Dec  1 05:30:00 np0005540826 ceph-mon[80026]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 05:30:00 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:30:00.738433) [db/flush_job.cc:967] [default] [JOB 39] Level-0 flush table #68: 247709 bytes OK
Dec  1 05:30:00 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:30:00.738450) [db/memtable_list.cc:519] [default] Level-0 commit table #68 started
Dec  1 05:30:00 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:30:00.739652) [db/memtable_list.cc:722] [default] Level-0 commit table #68: memtable #1 done
Dec  1 05:30:00 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:30:00.739663) EVENT_LOG_v1 {"time_micros": 1764585000739659, "job": 39, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  1 05:30:00 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:30:00.739677) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  1 05:30:00 np0005540826 ceph-mon[80026]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 39] Try to delete WAL files size 401599, prev total WAL file size 401599, number of live WAL files 2.
Dec  1 05:30:00 np0005540826 ceph-mon[80026]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000064.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:30:00 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:30:00.740081) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740031303034' seq:72057594037927935, type:22 .. '6D6772737461740031323536' seq:0, type:0; will stop at (end)
Dec  1 05:30:00 np0005540826 ceph-mon[80026]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 40] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  1 05:30:00 np0005540826 ceph-mon[80026]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 39 Base level 0, inputs: [68(241KB)], [66(15MB)]
Dec  1 05:30:00 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764585000740108, "job": 40, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [68], "files_L6": [66], "score": -1, "input_data_size": 16992477, "oldest_snapshot_seqno": -1}
Dec  1 05:30:00 np0005540826 ceph-mon[80026]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 40] Generated table #69: 6607 keys, 12891638 bytes, temperature: kUnknown
Dec  1 05:30:00 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764585000803242, "cf_name": "default", "job": 40, "event": "table_file_creation", "file_number": 69, "file_size": 12891638, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12850428, "index_size": 23562, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 16581, "raw_key_size": 171840, "raw_average_key_size": 26, "raw_value_size": 12734556, "raw_average_value_size": 1927, "num_data_blocks": 937, "num_entries": 6607, "num_filter_entries": 6607, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582559, "oldest_key_time": 0, "file_creation_time": 1764585000, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d6e1f97e-eb58-41c1-b758-cb672eabd75e", "db_session_id": "KSHNHU57VR1LHZ6EZUM4", "orig_file_number": 69, "seqno_to_time_mapping": "N/A"}}
Dec  1 05:30:00 np0005540826 ceph-mon[80026]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 05:30:00 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:30:00.803451) [db/compaction/compaction_job.cc:1663] [default] [JOB 40] Compacted 1@0 + 1@6 files to L6 => 12891638 bytes
Dec  1 05:30:00 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:30:00.805312) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 268.9 rd, 204.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.2, 16.0 +0.0 blob) out(12.3 +0.0 blob), read-write-amplify(120.6) write-amplify(52.0) OK, records in: 7117, records dropped: 510 output_compression: NoCompression
Dec  1 05:30:00 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:30:00.805328) EVENT_LOG_v1 {"time_micros": 1764585000805320, "job": 40, "event": "compaction_finished", "compaction_time_micros": 63199, "compaction_time_cpu_micros": 26679, "output_level": 6, "num_output_files": 1, "total_output_size": 12891638, "num_input_records": 7117, "num_output_records": 6607, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  1 05:30:00 np0005540826 ceph-mon[80026]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000068.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:30:00 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764585000805461, "job": 40, "event": "table_file_deletion", "file_number": 68}
Dec  1 05:30:00 np0005540826 ceph-mon[80026]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000066.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:30:00 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764585000808807, "job": 40, "event": "table_file_deletion", "file_number": 66}
Dec  1 05:30:00 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:30:00.739978) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:30:00 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:30:00.808928) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:30:00 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:30:00.808934) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:30:00 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:30:00.808936) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:30:00 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:30:00.808938) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:30:00 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:30:00.808940) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:30:01 np0005540826 nova_compute[229148]: 2025-12-01 10:30:01.144 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:30:01 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:30:01 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:30:01 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:30:01.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:30:01 np0005540826 nova_compute[229148]: 2025-12-01 10:30:01.415 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:30:01 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:30:02 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:30:02 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:30:02 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:30:02.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:30:02 np0005540826 nova_compute[229148]: 2025-12-01 10:30:02.178 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:30:03 np0005540826 nova_compute[229148]: 2025-12-01 10:30:03.109 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:30:03 np0005540826 nova_compute[229148]: 2025-12-01 10:30:03.110 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  1 05:30:03 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:30:03 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:30:03 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:30:03.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:30:04 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:30:04 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:30:04 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:30:04.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:30:04 np0005540826 nova_compute[229148]: 2025-12-01 10:30:04.110 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:30:04 np0005540826 nova_compute[229148]: 2025-12-01 10:30:04.111 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:30:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:30:04.564 141685 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:30:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:30:04.564 141685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:30:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:30:04.564 141685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:30:05 np0005540826 nova_compute[229148]: 2025-12-01 10:30:05.105 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:30:05 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:30:05 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:30:05 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:30:05.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:30:05 np0005540826 nova_compute[229148]: 2025-12-01 10:30:05.660 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:30:06 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:30:06 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:30:06 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:30:06.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:30:06 np0005540826 nova_compute[229148]: 2025-12-01 10:30:06.110 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:30:06 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:30:07 np0005540826 nova_compute[229148]: 2025-12-01 10:30:07.110 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:30:07 np0005540826 nova_compute[229148]: 2025-12-01 10:30:07.217 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:30:07 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:30:07 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:30:07 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:30:07.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:30:08 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:30:08 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:30:08 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:30:08.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:30:09 np0005540826 podman[252061]: 2025-12-01 10:30:09.019255448 +0000 UTC m=+0.089751373 container health_status 6b06bbb7dde72e1297fe084ecc5611549b12415c8a2a7d771b9728c6924dbf55 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec  1 05:30:09 np0005540826 podman[252104]: 2025-12-01 10:30:09.113371608 +0000 UTC m=+0.060028774 container health_status 2f6a34a2eb1139886d5df897b4264e2aa17883bd121a8757ac91426f6d44356f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Dec  1 05:30:09 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:30:09 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:30:09 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:30:09.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:30:10 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:30:10 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:30:10 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:30:10.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:30:10 np0005540826 nova_compute[229148]: 2025-12-01 10:30:10.661 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:30:11 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:30:11 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:30:11 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:30:11.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:30:11 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:30:12 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:30:12 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:30:12 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:30:12.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:30:12 np0005540826 nova_compute[229148]: 2025-12-01 10:30:12.110 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:30:12 np0005540826 nova_compute[229148]: 2025-12-01 10:30:12.220 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:30:13 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:30:13 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:30:13 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:30:13.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:30:14 np0005540826 nova_compute[229148]: 2025-12-01 10:30:14.014 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:30:14 np0005540826 nova_compute[229148]: 2025-12-01 10:30:14.014 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:30:14 np0005540826 nova_compute[229148]: 2025-12-01 10:30:14.015 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:30:14 np0005540826 nova_compute[229148]: 2025-12-01 10:30:14.015 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  1 05:30:14 np0005540826 nova_compute[229148]: 2025-12-01 10:30:14.015 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:30:14 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:30:14 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:30:14 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:30:14.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:30:14 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:30:14 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1438712320' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:30:14 np0005540826 nova_compute[229148]: 2025-12-01 10:30:14.451 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.435s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:30:14 np0005540826 nova_compute[229148]: 2025-12-01 10:30:14.584 229152 WARNING nova.virt.libvirt.driver [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  1 05:30:14 np0005540826 nova_compute[229148]: 2025-12-01 10:30:14.585 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4874MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  1 05:30:14 np0005540826 nova_compute[229148]: 2025-12-01 10:30:14.585 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:30:14 np0005540826 nova_compute[229148]: 2025-12-01 10:30:14.586 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:30:14 np0005540826 nova_compute[229148]: 2025-12-01 10:30:14.821 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  1 05:30:14 np0005540826 nova_compute[229148]: 2025-12-01 10:30:14.821 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  1 05:30:14 np0005540826 nova_compute[229148]: 2025-12-01 10:30:14.948 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:30:15 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:30:15 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:30:15 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:30:15.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:30:15 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:30:15 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1105278184' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:30:15 np0005540826 nova_compute[229148]: 2025-12-01 10:30:15.380 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.432s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:30:15 np0005540826 nova_compute[229148]: 2025-12-01 10:30:15.385 229152 DEBUG nova.compute.provider_tree [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Inventory has not changed in ProviderTree for provider: 19014d04-db84-4f3d-831b-084720e9168c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  1 05:30:15 np0005540826 nova_compute[229148]: 2025-12-01 10:30:15.405 229152 DEBUG nova.scheduler.client.report [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Inventory has not changed for provider 19014d04-db84-4f3d-831b-084720e9168c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  1 05:30:15 np0005540826 nova_compute[229148]: 2025-12-01 10:30:15.406 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  1 05:30:15 np0005540826 nova_compute[229148]: 2025-12-01 10:30:15.407 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.821s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:30:15 np0005540826 nova_compute[229148]: 2025-12-01 10:30:15.407 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:30:15 np0005540826 nova_compute[229148]: 2025-12-01 10:30:15.407 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Dec  1 05:30:15 np0005540826 nova_compute[229148]: 2025-12-01 10:30:15.436 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Dec  1 05:30:15 np0005540826 nova_compute[229148]: 2025-12-01 10:30:15.662 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:30:16 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:30:16 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:30:16 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:30:16.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:30:16 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:30:17 np0005540826 nova_compute[229148]: 2025-12-01 10:30:17.223 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:30:17 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:30:17 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:30:17 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:30:17.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:30:18 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:30:18 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:30:18 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:30:18.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:30:19 np0005540826 nova_compute[229148]: 2025-12-01 10:30:19.110 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:30:19 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:30:19 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:30:19 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:30:19.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:30:20 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:30:20 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:30:20 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:30:20.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:30:20 np0005540826 nova_compute[229148]: 2025-12-01 10:30:20.664 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:30:21 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:30:21 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:30:21 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:30:21.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:30:21 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:30:22 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:30:22 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:30:22 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:30:22.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:30:22 np0005540826 nova_compute[229148]: 2025-12-01 10:30:22.225 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:30:23 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:30:23 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:30:23 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:30:23.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:30:24 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:30:24 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:30:24 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:30:24.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:30:25 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:30:25 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:30:25 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:30:25.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:30:25 np0005540826 nova_compute[229148]: 2025-12-01 10:30:25.698 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:30:26 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:30:26 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:30:26 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:30:26.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:30:26 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:30:27 np0005540826 nova_compute[229148]: 2025-12-01 10:30:27.229 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:30:27 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:30:27 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:30:27 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:30:27.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:30:28 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:30:28 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:30:28 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:30:28.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:30:29 np0005540826 podman[252202]: 2025-12-01 10:30:29.099856804 +0000 UTC m=+0.052998288 container health_status 9ce59c2758d021c154e97f8a3178b73d8f43045c59b9ba6fd93369a760a49136 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251125)
Dec  1 05:30:29 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:30:29 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:30:29 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:30:29.350 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:30:30 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:30:30 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:30:30 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:30:30.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:30:30 np0005540826 nova_compute[229148]: 2025-12-01 10:30:30.701 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:30:31 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:30:31 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:30:31 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:30:31.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:30:31 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:30:32 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:30:32 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:30:32 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:30:32.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:30:32 np0005540826 nova_compute[229148]: 2025-12-01 10:30:32.231 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:30:33 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:30:33 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:30:33 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:30:33.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:30:34 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:30:34 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 05:30:34 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:30:34.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 05:30:35 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:30:35 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:30:35 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:30:35.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:30:35 np0005540826 nova_compute[229148]: 2025-12-01 10:30:35.704 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:30:36 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:30:36 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:30:36 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:30:36.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:30:36 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:30:37 np0005540826 nova_compute[229148]: 2025-12-01 10:30:37.233 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:30:37 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:30:37 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:30:37 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:30:37.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:30:38 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:30:38 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:30:38 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:30:38.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:30:39 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:30:39 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:30:39 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:30:39.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:30:39 np0005540826 podman[252229]: 2025-12-01 10:30:39.966841638 +0000 UTC m=+0.050715417 container health_status 2f6a34a2eb1139886d5df897b4264e2aa17883bd121a8757ac91426f6d44356f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 05:30:40 np0005540826 podman[252230]: 2025-12-01 10:30:40.001013206 +0000 UTC m=+0.076499723 container health_status 6b06bbb7dde72e1297fe084ecc5611549b12415c8a2a7d771b9728c6924dbf55 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Dec  1 05:30:40 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:30:40 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:30:40 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:30:40.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:30:40 np0005540826 nova_compute[229148]: 2025-12-01 10:30:40.706 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:30:41 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:30:41 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:30:41 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:30:41.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:30:41 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:30:42 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:30:42 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:30:42 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:30:42.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:30:42 np0005540826 nova_compute[229148]: 2025-12-01 10:30:42.236 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:30:43 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:30:43 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:30:43 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:30:43.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:30:44 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:30:44 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:30:44 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:30:44.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:30:45 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:30:45 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:30:45 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:30:45.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:30:45 np0005540826 nova_compute[229148]: 2025-12-01 10:30:45.708 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:30:45 np0005540826 ceph-mon[80026]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #70. Immutable memtables: 0.
Dec  1 05:30:45 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:30:45.858813) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  1 05:30:45 np0005540826 ceph-mon[80026]: rocksdb: [db/flush_job.cc:856] [default] [JOB 41] Flushing memtable with next log file: 70
Dec  1 05:30:45 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764585045858858, "job": 41, "event": "flush_started", "num_memtables": 1, "num_entries": 676, "num_deletes": 250, "total_data_size": 1201741, "memory_usage": 1219376, "flush_reason": "Manual Compaction"}
Dec  1 05:30:45 np0005540826 ceph-mon[80026]: rocksdb: [db/flush_job.cc:885] [default] [JOB 41] Level-0 flush table #71: started
Dec  1 05:30:45 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764585045867239, "cf_name": "default", "job": 41, "event": "table_file_creation", "file_number": 71, "file_size": 786181, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 36852, "largest_seqno": 37523, "table_properties": {"data_size": 782939, "index_size": 1150, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1029, "raw_key_size": 6668, "raw_average_key_size": 16, "raw_value_size": 776411, "raw_average_value_size": 1950, "num_data_blocks": 51, "num_entries": 398, "num_filter_entries": 398, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764585001, "oldest_key_time": 1764585001, "file_creation_time": 1764585045, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d6e1f97e-eb58-41c1-b758-cb672eabd75e", "db_session_id": "KSHNHU57VR1LHZ6EZUM4", "orig_file_number": 71, "seqno_to_time_mapping": "N/A"}}
Dec  1 05:30:45 np0005540826 ceph-mon[80026]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 41] Flush lasted 8490 microseconds, and 3434 cpu microseconds.
Dec  1 05:30:45 np0005540826 ceph-mon[80026]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 05:30:45 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:30:45.867294) [db/flush_job.cc:967] [default] [JOB 41] Level-0 flush table #71: 786181 bytes OK
Dec  1 05:30:45 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:30:45.867327) [db/memtable_list.cc:519] [default] Level-0 commit table #71 started
Dec  1 05:30:45 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:30:45.868859) [db/memtable_list.cc:722] [default] Level-0 commit table #71: memtable #1 done
Dec  1 05:30:45 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:30:45.868881) EVENT_LOG_v1 {"time_micros": 1764585045868874, "job": 41, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  1 05:30:45 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:30:45.868905) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  1 05:30:45 np0005540826 ceph-mon[80026]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 41] Try to delete WAL files size 1198067, prev total WAL file size 1198067, number of live WAL files 2.
Dec  1 05:30:45 np0005540826 ceph-mon[80026]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000067.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:30:45 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:30:45.869717) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B7600323531' seq:72057594037927935, type:22 .. '6B7600353032' seq:0, type:0; will stop at (end)
Dec  1 05:30:45 np0005540826 ceph-mon[80026]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 42] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  1 05:30:45 np0005540826 ceph-mon[80026]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 41 Base level 0, inputs: [71(767KB)], [69(12MB)]
Dec  1 05:30:45 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764585045869859, "job": 42, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [71], "files_L6": [69], "score": -1, "input_data_size": 13677819, "oldest_snapshot_seqno": -1}
Dec  1 05:30:45 np0005540826 ceph-mon[80026]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 42] Generated table #72: 6494 keys, 12314247 bytes, temperature: kUnknown
Dec  1 05:30:45 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764585045934592, "cf_name": "default", "job": 42, "event": "table_file_creation", "file_number": 72, "file_size": 12314247, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12273955, "index_size": 22966, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 16261, "raw_key_size": 171167, "raw_average_key_size": 26, "raw_value_size": 12159891, "raw_average_value_size": 1872, "num_data_blocks": 899, "num_entries": 6494, "num_filter_entries": 6494, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582559, "oldest_key_time": 0, "file_creation_time": 1764585045, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d6e1f97e-eb58-41c1-b758-cb672eabd75e", "db_session_id": "KSHNHU57VR1LHZ6EZUM4", "orig_file_number": 72, "seqno_to_time_mapping": "N/A"}}
Dec  1 05:30:45 np0005540826 ceph-mon[80026]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 05:30:45 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:30:45.935159) [db/compaction/compaction_job.cc:1663] [default] [JOB 42] Compacted 1@0 + 1@6 files to L6 => 12314247 bytes
Dec  1 05:30:45 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:30:45.936739) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 210.4 rd, 189.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.7, 12.3 +0.0 blob) out(11.7 +0.0 blob), read-write-amplify(33.1) write-amplify(15.7) OK, records in: 7005, records dropped: 511 output_compression: NoCompression
Dec  1 05:30:45 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:30:45.936768) EVENT_LOG_v1 {"time_micros": 1764585045936755, "job": 42, "event": "compaction_finished", "compaction_time_micros": 65013, "compaction_time_cpu_micros": 32198, "output_level": 6, "num_output_files": 1, "total_output_size": 12314247, "num_input_records": 7005, "num_output_records": 6494, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  1 05:30:45 np0005540826 ceph-mon[80026]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000071.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:30:45 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764585045937644, "job": 42, "event": "table_file_deletion", "file_number": 71}
Dec  1 05:30:45 np0005540826 ceph-mon[80026]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000069.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:30:45 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764585045941555, "job": 42, "event": "table_file_deletion", "file_number": 69}
Dec  1 05:30:45 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:30:45.869521) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:30:45 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:30:45.941692) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:30:45 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:30:45.941699) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:30:45 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:30:45.941701) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:30:45 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:30:45.941702) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:30:45 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:30:45.941704) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:30:46 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:30:46 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:30:46 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:30:46.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:30:46 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 05:30:46 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:30:46 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:30:46 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 05:30:46 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:30:47 np0005540826 nova_compute[229148]: 2025-12-01 10:30:47.239 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:30:47 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:30:47 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:30:47 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:30:47.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:30:48 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:30:48 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:30:48 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:30:48.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:30:49 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:30:49 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:30:49 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:30:49.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:30:50 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:30:50 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:30:50 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:30:50.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:30:50 np0005540826 nova_compute[229148]: 2025-12-01 10:30:50.710 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:30:51 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:30:51 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 05:30:51 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:30:51.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 05:30:51 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:30:52 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:30:52 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:30:52 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:30:52.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:30:52 np0005540826 nova_compute[229148]: 2025-12-01 10:30:52.243 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:30:53 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:30:53 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:30:53 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:30:53.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:30:53 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:30:53 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:30:54 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:30:54 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 05:30:54 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:30:54.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 05:30:55 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:30:55 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:30:55 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:30:55.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:30:55 np0005540826 nova_compute[229148]: 2025-12-01 10:30:55.711 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:30:56 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:30:56 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:30:56 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:30:56.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:30:56 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:30:57 np0005540826 nova_compute[229148]: 2025-12-01 10:30:57.245 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:30:57 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:30:57 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:30:57 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:30:57.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:30:57 np0005540826 radosgw[83613]: INFO: RGWReshardLock::lock found lock on reshard.0000000000 to be held by another RGW process; skipping for now
Dec  1 05:30:57 np0005540826 radosgw[83613]: INFO: RGWReshardLock::lock found lock on reshard.0000000002 to be held by another RGW process; skipping for now
Dec  1 05:30:57 np0005540826 radosgw[83613]: INFO: RGWReshardLock::lock found lock on reshard.0000000004 to be held by another RGW process; skipping for now
Dec  1 05:30:58 np0005540826 radosgw[83613]: INFO: RGWReshardLock::lock found lock on reshard.0000000005 to be held by another RGW process; skipping for now
Dec  1 05:30:58 np0005540826 radosgw[83613]: INFO: RGWReshardLock::lock found lock on reshard.0000000006 to be held by another RGW process; skipping for now
Dec  1 05:30:58 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:30:58 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:30:58 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:30:58.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:30:58 np0005540826 radosgw[83613]: INFO: RGWReshardLock::lock found lock on reshard.0000000008 to be held by another RGW process; skipping for now
Dec  1 05:30:58 np0005540826 radosgw[83613]: INFO: RGWReshardLock::lock found lock on reshard.0000000010 to be held by another RGW process; skipping for now
Dec  1 05:30:58 np0005540826 radosgw[83613]: INFO: RGWReshardLock::lock found lock on reshard.0000000011 to be held by another RGW process; skipping for now
Dec  1 05:30:58 np0005540826 radosgw[83613]: INFO: RGWReshardLock::lock found lock on reshard.0000000013 to be held by another RGW process; skipping for now
Dec  1 05:30:58 np0005540826 radosgw[83613]: INFO: RGWReshardLock::lock found lock on reshard.0000000015 to be held by another RGW process; skipping for now
Dec  1 05:30:59 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:30:59 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:30:59 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:30:59.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:31:00 np0005540826 podman[252415]: 2025-12-01 10:31:00.012724649 +0000 UTC m=+0.085726237 container health_status 9ce59c2758d021c154e97f8a3178b73d8f43045c59b9ba6fd93369a760a49136 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251125)
Dec  1 05:31:00 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:31:00 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:31:00 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:31:00.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:31:00 np0005540826 nova_compute[229148]: 2025-12-01 10:31:00.711 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:31:01 np0005540826 nova_compute[229148]: 2025-12-01 10:31:01.153 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:31:01 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:31:01 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:31:01 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:31:01.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:31:01 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:31:02 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:31:02 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 05:31:02 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:31:02.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 05:31:02 np0005540826 nova_compute[229148]: 2025-12-01 10:31:02.110 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:31:02 np0005540826 nova_compute[229148]: 2025-12-01 10:31:02.111 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  1 05:31:02 np0005540826 nova_compute[229148]: 2025-12-01 10:31:02.111 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  1 05:31:02 np0005540826 nova_compute[229148]: 2025-12-01 10:31:02.246 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:31:02 np0005540826 nova_compute[229148]: 2025-12-01 10:31:02.733 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  1 05:31:03 np0005540826 nova_compute[229148]: 2025-12-01 10:31:03.110 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:31:03 np0005540826 nova_compute[229148]: 2025-12-01 10:31:03.110 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  1 05:31:03 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:31:03 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 05:31:03 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:31:03.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 05:31:04 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:31:04 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:31:04 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:31:04.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:31:04 np0005540826 nova_compute[229148]: 2025-12-01 10:31:04.110 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:31:04 np0005540826 nova_compute[229148]: 2025-12-01 10:31:04.110 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:31:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:31:04.564 141685 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:31:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:31:04.564 141685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:31:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:31:04.564 141685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:31:05 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:31:05 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:31:05 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:31:05.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:31:05 np0005540826 nova_compute[229148]: 2025-12-01 10:31:05.750 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:31:06 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:31:06 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:31:06 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:31:06.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:31:06 np0005540826 nova_compute[229148]: 2025-12-01 10:31:06.105 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:31:06 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:31:07 np0005540826 nova_compute[229148]: 2025-12-01 10:31:07.109 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:31:07 np0005540826 nova_compute[229148]: 2025-12-01 10:31:07.249 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:31:07 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:31:07 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 05:31:07 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:31:07.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 05:31:08 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:31:08 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:31:08 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:31:08.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:31:08 np0005540826 nova_compute[229148]: 2025-12-01 10:31:08.110 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:31:09 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:31:09 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:31:09 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:31:09.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:31:10 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:31:10 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 05:31:10 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:31:10.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 05:31:10 np0005540826 nova_compute[229148]: 2025-12-01 10:31:10.750 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:31:10 np0005540826 podman[252466]: 2025-12-01 10:31:10.978004211 +0000 UTC m=+0.053035959 container health_status 2f6a34a2eb1139886d5df897b4264e2aa17883bd121a8757ac91426f6d44356f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec  1 05:31:11 np0005540826 podman[252467]: 2025-12-01 10:31:11.010976437 +0000 UTC m=+0.079500572 container health_status 6b06bbb7dde72e1297fe084ecc5611549b12415c8a2a7d771b9728c6924dbf55 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, container_name=ovn_controller, config_id=ovn_controller, org.label-schema.build-date=20251125)
Dec  1 05:31:11 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:31:11 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:31:11 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:31:11.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:31:11 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:31:12 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:31:12 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:31:12 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:31:12.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:31:12 np0005540826 nova_compute[229148]: 2025-12-01 10:31:12.251 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:31:13 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:31:13 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:31:13 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:31:13.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:31:14 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:31:14 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:31:14 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:31:14.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:31:14 np0005540826 nova_compute[229148]: 2025-12-01 10:31:14.110 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:31:14 np0005540826 nova_compute[229148]: 2025-12-01 10:31:14.135 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:31:14 np0005540826 nova_compute[229148]: 2025-12-01 10:31:14.136 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:31:14 np0005540826 nova_compute[229148]: 2025-12-01 10:31:14.136 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:31:14 np0005540826 nova_compute[229148]: 2025-12-01 10:31:14.136 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  1 05:31:14 np0005540826 nova_compute[229148]: 2025-12-01 10:31:14.136 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:31:14 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:31:14 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3390945076' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:31:14 np0005540826 nova_compute[229148]: 2025-12-01 10:31:14.569 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.432s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:31:14 np0005540826 nova_compute[229148]: 2025-12-01 10:31:14.736 229152 WARNING nova.virt.libvirt.driver [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  1 05:31:14 np0005540826 nova_compute[229148]: 2025-12-01 10:31:14.738 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4889MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  1 05:31:14 np0005540826 nova_compute[229148]: 2025-12-01 10:31:14.738 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:31:14 np0005540826 nova_compute[229148]: 2025-12-01 10:31:14.738 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:31:14 np0005540826 nova_compute[229148]: 2025-12-01 10:31:14.876 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  1 05:31:14 np0005540826 nova_compute[229148]: 2025-12-01 10:31:14.877 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  1 05:31:14 np0005540826 nova_compute[229148]: 2025-12-01 10:31:14.915 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:31:15 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:31:15 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1827298230' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:31:15 np0005540826 nova_compute[229148]: 2025-12-01 10:31:15.350 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.435s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:31:15 np0005540826 nova_compute[229148]: 2025-12-01 10:31:15.355 229152 DEBUG nova.compute.provider_tree [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Inventory has not changed in ProviderTree for provider: 19014d04-db84-4f3d-831b-084720e9168c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  1 05:31:15 np0005540826 nova_compute[229148]: 2025-12-01 10:31:15.378 229152 DEBUG nova.scheduler.client.report [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Inventory has not changed for provider 19014d04-db84-4f3d-831b-084720e9168c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  1 05:31:15 np0005540826 nova_compute[229148]: 2025-12-01 10:31:15.379 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  1 05:31:15 np0005540826 nova_compute[229148]: 2025-12-01 10:31:15.380 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.641s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:31:15 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:31:15 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:31:15 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:31:15.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:31:15 np0005540826 nova_compute[229148]: 2025-12-01 10:31:15.752 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:31:16 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:31:16 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 05:31:16 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:31:16.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 05:31:16 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:31:17 np0005540826 nova_compute[229148]: 2025-12-01 10:31:17.254 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:31:17 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:31:17 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:31:17 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:31:17.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:31:18 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:31:18 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:31:18 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:31:18.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:31:19 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:31:19 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:31:19 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:31:19.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:31:20 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:31:20 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:31:20 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:31:20.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:31:20 np0005540826 nova_compute[229148]: 2025-12-01 10:31:20.753 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:31:21 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:31:21 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:31:21 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:31:21.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:31:21 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:31:22 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:31:22 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:31:22 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:31:22.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:31:22 np0005540826 nova_compute[229148]: 2025-12-01 10:31:22.257 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:31:23 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:31:23 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:31:23 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:31:23.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:31:24 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:31:24 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:31:24 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:31:24.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:31:25 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:31:25 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:31:25 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:31:25.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:31:25 np0005540826 nova_compute[229148]: 2025-12-01 10:31:25.818 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:31:26 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:31:26 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:31:26 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:31:26.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:31:26 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:31:27 np0005540826 nova_compute[229148]: 2025-12-01 10:31:27.260 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:31:27 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:31:27 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:31:27 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:31:27.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:31:28 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:31:28 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:31:28 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:31:28.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:31:29 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:31:29 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 05:31:29 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:31:29.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 05:31:30 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:31:30 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 05:31:30 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:31:30.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 05:31:30 np0005540826 nova_compute[229148]: 2025-12-01 10:31:30.821 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:31:30 np0005540826 podman[252586]: 2025-12-01 10:31:30.985440422 +0000 UTC m=+0.069927988 container health_status 9ce59c2758d021c154e97f8a3178b73d8f43045c59b9ba6fd93369a760a49136 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec  1 05:31:31 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:31:31 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:31:31 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:31:31.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:31:31 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:31:32 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:31:32 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:31:32 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:31:32.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:31:32 np0005540826 nova_compute[229148]: 2025-12-01 10:31:32.263 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:31:33 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:31:33 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:31:33 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:31:33.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:31:34 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:31:34 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:31:34 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:31:34.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:31:35 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:31:35 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:31:35 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:31:35.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:31:35 np0005540826 nova_compute[229148]: 2025-12-01 10:31:35.821 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:31:36 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:31:36 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:31:36 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:31:36.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:31:36 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:31:37 np0005540826 nova_compute[229148]: 2025-12-01 10:31:37.265 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:31:37 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:31:37 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:31:37 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:31:37.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:31:38 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:31:38 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:31:38 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:31:38.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:31:39 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:31:39 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:31:39 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:31:39.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:31:40 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:31:40 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:31:40 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:31:40.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:31:40 np0005540826 nova_compute[229148]: 2025-12-01 10:31:40.823 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:31:41 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:31:41 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:31:41 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:31:41.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:31:41 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:31:41 np0005540826 podman[252611]: 2025-12-01 10:31:41.962931798 +0000 UTC m=+0.049155306 container health_status 2f6a34a2eb1139886d5df897b4264e2aa17883bd121a8757ac91426f6d44356f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Dec  1 05:31:41 np0005540826 podman[252612]: 2025-12-01 10:31:41.998288497 +0000 UTC m=+0.083698103 container health_status 6b06bbb7dde72e1297fe084ecc5611549b12415c8a2a7d771b9728c6924dbf55 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec  1 05:31:42 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:31:42 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:31:42 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:31:42.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:31:42 np0005540826 nova_compute[229148]: 2025-12-01 10:31:42.267 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:31:43 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:31:43 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:31:43 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:31:43.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:31:44 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:31:44 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:31:44 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:31:44.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:31:45 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:31:45 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:31:45 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:31:45.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:31:45 np0005540826 nova_compute[229148]: 2025-12-01 10:31:45.825 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:31:46 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:31:46 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:31:46 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:31:46.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:31:46 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:31:47 np0005540826 nova_compute[229148]: 2025-12-01 10:31:47.275 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:31:47 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:31:47 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:31:47 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:31:47.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:31:48 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:31:48 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:31:48 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:31:48.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:31:49 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:31:49 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:31:49 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:31:49.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:31:50 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:31:50 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:31:50 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:31:50.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:31:50 np0005540826 nova_compute[229148]: 2025-12-01 10:31:50.857 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:31:51 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:31:51 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:31:51 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:31:51.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:31:51 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:31:52 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:31:52 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:31:52 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:31:52.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:31:52 np0005540826 nova_compute[229148]: 2025-12-01 10:31:52.277 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:31:53 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:31:53 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:31:53 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:31:53.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:31:54 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:31:54 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:31:54 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:31:54.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:31:54 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 05:31:54 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:31:54 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:31:54 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 05:31:55 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:31:55 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 05:31:55 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:31:55.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 05:31:55 np0005540826 nova_compute[229148]: 2025-12-01 10:31:55.859 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:31:56 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:31:56 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:31:56 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:31:56.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:31:56 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:31:57 np0005540826 nova_compute[229148]: 2025-12-01 10:31:57.280 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:31:57 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:31:57 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:31:57 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:31:57.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:31:58 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:31:58 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:31:58 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:31:58.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:31:59 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:31:59 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:31:59 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:31:59.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:32:00 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:32:00 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 05:32:00 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:32:00.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 05:32:00 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:32:00 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:32:00 np0005540826 nova_compute[229148]: 2025-12-01 10:32:00.862 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:32:01 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:32:01 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:32:01 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:32:01.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:32:01 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:32:01 np0005540826 podman[252796]: 2025-12-01 10:32:01.978025043 +0000 UTC m=+0.061202816 container health_status 9ce59c2758d021c154e97f8a3178b73d8f43045c59b9ba6fd93369a760a49136 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, org.label-schema.license=GPLv2)
Dec  1 05:32:02 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:32:02 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:32:02 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:32:02.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:32:02 np0005540826 nova_compute[229148]: 2025-12-01 10:32:02.282 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:32:03 np0005540826 nova_compute[229148]: 2025-12-01 10:32:03.379 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:32:03 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:32:03 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:32:03 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:32:03.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:32:03 np0005540826 nova_compute[229148]: 2025-12-01 10:32:03.678 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:32:03 np0005540826 nova_compute[229148]: 2025-12-01 10:32:03.678 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  1 05:32:03 np0005540826 nova_compute[229148]: 2025-12-01 10:32:03.678 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  1 05:32:03 np0005540826 nova_compute[229148]: 2025-12-01 10:32:03.706 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  1 05:32:03 np0005540826 nova_compute[229148]: 2025-12-01 10:32:03.706 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:32:04 np0005540826 nova_compute[229148]: 2025-12-01 10:32:04.110 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:32:04 np0005540826 nova_compute[229148]: 2025-12-01 10:32:04.110 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  1 05:32:04 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:32:04 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 05:32:04 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:32:04.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 05:32:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:32:04.565 141685 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:32:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:32:04.565 141685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:32:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:32:04.566 141685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:32:05 np0005540826 nova_compute[229148]: 2025-12-01 10:32:05.110 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:32:05 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:32:05 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 05:32:05 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:32:05.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 05:32:05 np0005540826 nova_compute[229148]: 2025-12-01 10:32:05.865 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:32:06 np0005540826 nova_compute[229148]: 2025-12-01 10:32:06.109 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:32:06 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:32:06 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:32:06 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:32:06.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:32:06 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:32:07 np0005540826 nova_compute[229148]: 2025-12-01 10:32:07.105 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:32:07 np0005540826 nova_compute[229148]: 2025-12-01 10:32:07.109 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:32:07 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec  1 05:32:07 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/137364324' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec  1 05:32:07 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec  1 05:32:07 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/137364324' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec  1 05:32:07 np0005540826 nova_compute[229148]: 2025-12-01 10:32:07.284 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:32:07 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:32:07 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:32:07 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:32:07.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:32:08 np0005540826 nova_compute[229148]: 2025-12-01 10:32:08.110 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:32:08 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:32:08 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 05:32:08 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:32:08.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 05:32:09 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:32:09 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:32:09 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:32:09.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:32:10 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:32:10 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:32:10 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:32:10.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:32:10 np0005540826 nova_compute[229148]: 2025-12-01 10:32:10.865 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:32:11 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:32:11 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:32:11 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:32:11.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:32:11 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:32:12 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:32:12 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 05:32:12 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:32:12.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 05:32:12 np0005540826 nova_compute[229148]: 2025-12-01 10:32:12.286 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:32:12 np0005540826 podman[252847]: 2025-12-01 10:32:12.980724947 +0000 UTC m=+0.058091843 container health_status 2f6a34a2eb1139886d5df897b4264e2aa17883bd121a8757ac91426f6d44356f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec  1 05:32:13 np0005540826 podman[252848]: 2025-12-01 10:32:13.023985306 +0000 UTC m=+0.097060118 container health_status 6b06bbb7dde72e1297fe084ecc5611549b12415c8a2a7d771b9728c6924dbf55 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec  1 05:32:13 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:32:13 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:32:13 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:32:13.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:32:14 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:32:14 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:32:14 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:32:14.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:32:15 np0005540826 nova_compute[229148]: 2025-12-01 10:32:15.109 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:32:15 np0005540826 nova_compute[229148]: 2025-12-01 10:32:15.201 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:32:15 np0005540826 nova_compute[229148]: 2025-12-01 10:32:15.201 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:32:15 np0005540826 nova_compute[229148]: 2025-12-01 10:32:15.202 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:32:15 np0005540826 nova_compute[229148]: 2025-12-01 10:32:15.202 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  1 05:32:15 np0005540826 nova_compute[229148]: 2025-12-01 10:32:15.202 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:32:15 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:32:15 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 05:32:15 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:32:15.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 05:32:15 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:32:15 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/568064490' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:32:15 np0005540826 nova_compute[229148]: 2025-12-01 10:32:15.659 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:32:15 np0005540826 nova_compute[229148]: 2025-12-01 10:32:15.821 229152 WARNING nova.virt.libvirt.driver [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  1 05:32:15 np0005540826 nova_compute[229148]: 2025-12-01 10:32:15.822 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4857MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  1 05:32:15 np0005540826 nova_compute[229148]: 2025-12-01 10:32:15.822 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:32:15 np0005540826 nova_compute[229148]: 2025-12-01 10:32:15.823 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:32:15 np0005540826 nova_compute[229148]: 2025-12-01 10:32:15.867 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:32:16 np0005540826 nova_compute[229148]: 2025-12-01 10:32:16.172 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  1 05:32:16 np0005540826 nova_compute[229148]: 2025-12-01 10:32:16.172 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  1 05:32:16 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:32:16 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:32:16 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:32:16.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:32:16 np0005540826 nova_compute[229148]: 2025-12-01 10:32:16.189 229152 DEBUG nova.scheduler.client.report [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Refreshing inventories for resource provider 19014d04-db84-4f3d-831b-084720e9168c _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Dec  1 05:32:16 np0005540826 nova_compute[229148]: 2025-12-01 10:32:16.215 229152 DEBUG nova.scheduler.client.report [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Updating ProviderTree inventory for provider 19014d04-db84-4f3d-831b-084720e9168c from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Dec  1 05:32:16 np0005540826 nova_compute[229148]: 2025-12-01 10:32:16.215 229152 DEBUG nova.compute.provider_tree [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Updating inventory in ProviderTree for provider 19014d04-db84-4f3d-831b-084720e9168c with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Dec  1 05:32:16 np0005540826 nova_compute[229148]: 2025-12-01 10:32:16.230 229152 DEBUG nova.scheduler.client.report [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Refreshing aggregate associations for resource provider 19014d04-db84-4f3d-831b-084720e9168c, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Dec  1 05:32:16 np0005540826 nova_compute[229148]: 2025-12-01 10:32:16.252 229152 DEBUG nova.scheduler.client.report [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Refreshing trait associations for resource provider 19014d04-db84-4f3d-831b-084720e9168c, traits: COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SHA,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE2,HW_CPU_X86_ABM,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_FDC,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE4A,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE,HW_CPU_X86_AMD_SVM,HW_CPU_X86_MMX,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_AESNI,HW_CPU_X86_AVX2,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSE42,COMPUTE_RESCUE_BFV,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_AVX,HW_CPU_X86_F16C,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_CLMUL,HW_CPU_X86_BMI,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_BMI2,HW_CPU_X86_FMA3,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SVM,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SOCKET_PCI_NUMA_AFFINITY _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Dec  1 05:32:16 np0005540826 nova_compute[229148]: 2025-12-01 10:32:16.272 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:32:16 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:32:16 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2213763123' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:32:16 np0005540826 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  1 05:32:16 np0005540826 nova_compute[229148]: 2025-12-01 10:32:16.704 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.432s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:32:16 np0005540826 nova_compute[229148]: 2025-12-01 10:32:16.709 229152 DEBUG nova.compute.provider_tree [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Inventory has not changed in ProviderTree for provider: 19014d04-db84-4f3d-831b-084720e9168c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  1 05:32:16 np0005540826 nova_compute[229148]: 2025-12-01 10:32:16.731 229152 DEBUG nova.scheduler.client.report [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Inventory has not changed for provider 19014d04-db84-4f3d-831b-084720e9168c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  1 05:32:16 np0005540826 nova_compute[229148]: 2025-12-01 10:32:16.733 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  1 05:32:16 np0005540826 nova_compute[229148]: 2025-12-01 10:32:16.733 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.911s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:32:16 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:32:17 np0005540826 nova_compute[229148]: 2025-12-01 10:32:17.288 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:32:17 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:32:17 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:32:17 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:32:17.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:32:18 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:32:18 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:32:18 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:32:18.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:32:19 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:32:19 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:32:19 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:32:19.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:32:20 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:32:20 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:32:20 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:32:20.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:32:20 np0005540826 nova_compute[229148]: 2025-12-01 10:32:20.907 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:32:21 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:32:21 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:32:21 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:32:21.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:32:21 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:32:22 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:32:22 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:32:22 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:32:22.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:32:22 np0005540826 nova_compute[229148]: 2025-12-01 10:32:22.291 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:32:23 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:32:23 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:32:23 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:32:23.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:32:24 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:32:24 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:32:24 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:32:24.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:32:25 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:32:25 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:32:25 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:32:25.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:32:25 np0005540826 nova_compute[229148]: 2025-12-01 10:32:25.909 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:32:26 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:32:26 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:32:26 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:32:26.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:32:26 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:32:27 np0005540826 nova_compute[229148]: 2025-12-01 10:32:27.294 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:32:27 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:32:27 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:32:27 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:32:27.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:32:28 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:32:28 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:32:28 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:32:28.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:32:29 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:32:29 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:32:29 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:32:29.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:32:30 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:32:30 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:32:30 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:32:30.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:32:30 np0005540826 nova_compute[229148]: 2025-12-01 10:32:30.911 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:32:31 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:32:31 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:32:31 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:32:31.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:32:31 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:32:32 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:32:32 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:32:32 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:32:32.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:32:32 np0005540826 nova_compute[229148]: 2025-12-01 10:32:32.296 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:32:33 np0005540826 podman[252971]: 2025-12-01 10:32:33.002586971 +0000 UTC m=+0.081628437 container health_status 9ce59c2758d021c154e97f8a3178b73d8f43045c59b9ba6fd93369a760a49136 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Dec  1 05:32:33 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:32:33 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:32:33 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:32:33.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:32:34 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:32:34 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:32:34 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:32:34.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:32:35 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:32:35 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:32:35 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:32:35.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:32:35 np0005540826 nova_compute[229148]: 2025-12-01 10:32:35.913 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:32:36 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:32:36 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:32:36 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:32:36.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:32:36 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:32:37 np0005540826 nova_compute[229148]: 2025-12-01 10:32:37.337 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:32:37 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:32:37 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:32:37 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:32:37.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:32:38 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:32:38 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:32:38 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:32:38.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:32:39 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:32:39 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:32:39 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:32:39.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:32:40 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:32:40 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:32:40 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:32:40.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:32:40 np0005540826 nova_compute[229148]: 2025-12-01 10:32:40.948 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:32:41 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:32:41 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:32:41 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:32:41.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:32:41 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:32:42 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:32:42 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:32:42 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:32:42.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:32:42 np0005540826 nova_compute[229148]: 2025-12-01 10:32:42.340 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:32:43 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:32:43 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:32:43 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:32:43.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:32:43 np0005540826 podman[252996]: 2025-12-01 10:32:43.977735612 +0000 UTC m=+0.060269174 container health_status 2f6a34a2eb1139886d5df897b4264e2aa17883bd121a8757ac91426f6d44356f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Dec  1 05:32:44 np0005540826 nova_compute[229148]: 2025-12-01 10:32:44.026 229152 DEBUG oslo_concurrency.processutils [None req-64cac02e-2179-4e9c-a452-97dadcc3883d 8f40188af6da43f2a935c6c0b2de642b 9a5734898a6345909986f17ddf57b27d - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:32:44 np0005540826 podman[252997]: 2025-12-01 10:32:44.032094828 +0000 UTC m=+0.102374184 container health_status 6b06bbb7dde72e1297fe084ecc5611549b12415c8a2a7d771b9728c6924dbf55 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Dec  1 05:32:44 np0005540826 nova_compute[229148]: 2025-12-01 10:32:44.062 229152 DEBUG oslo_concurrency.processutils [None req-64cac02e-2179-4e9c-a452-97dadcc3883d 8f40188af6da43f2a935c6c0b2de642b 9a5734898a6345909986f17ddf57b27d - - default default] CMD "env LANG=C uptime" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:32:44 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:32:44 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:32:44 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:32:44.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:32:45 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:32:45 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:32:45 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:32:45.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:32:45 np0005540826 nova_compute[229148]: 2025-12-01 10:32:45.949 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:32:46 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:32:46 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:32:46 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:32:46.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:32:46 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:32:47 np0005540826 nova_compute[229148]: 2025-12-01 10:32:47.344 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:32:47 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:32:47 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:32:47 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:32:47.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:32:48 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:32:48 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:32:48 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:32:48.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:32:49 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:32:49 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:32:49 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:32:49.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:32:50 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:32:50 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:32:50 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:32:50.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:32:50 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:32:50.520 141685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=15, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '36:10:da', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '4e:5c:35:98:90:37'}, ipsec=False) old=SB_Global(nb_cfg=14) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  1 05:32:50 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:32:50.521 141685 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  1 05:32:50 np0005540826 nova_compute[229148]: 2025-12-01 10:32:50.522 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:32:50 np0005540826 nova_compute[229148]: 2025-12-01 10:32:50.951 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:32:51 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:32:51 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:32:51 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:32:51.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:32:51 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:32:52 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:32:52 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:32:52 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:32:52.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:32:52 np0005540826 nova_compute[229148]: 2025-12-01 10:32:52.345 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:32:53 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:32:53 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:32:53 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:32:53.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:32:54 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:32:54 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:32:54 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:32:54.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:32:55 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:32:55 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:32:55 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:32:55.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:32:55 np0005540826 nova_compute[229148]: 2025-12-01 10:32:55.968 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:32:56 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:32:56 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:32:56 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:32:56.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:32:56 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:32:57 np0005540826 nova_compute[229148]: 2025-12-01 10:32:57.403 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:32:57 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:32:57.523 141685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=b99910e3-15ec-4cc7-b887-f5229f22d165, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '15'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  1 05:32:57 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:32:57 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:32:57 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:32:57.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:32:58 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:32:58 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:32:58 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:32:58.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:32:59 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:32:59 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:32:59 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:32:59.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:33:00 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:33:00 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:33:00 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:33:00.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:33:00 np0005540826 nova_compute[229148]: 2025-12-01 10:33:00.971 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:33:01 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:33:01 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:33:01 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:33:01.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:33:01 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 05:33:01 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:33:01 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:33:01 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 05:33:01 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:33:02 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:33:02 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:33:02 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:33:02.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:33:02 np0005540826 nova_compute[229148]: 2025-12-01 10:33:02.405 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:33:03 np0005540826 podman[253159]: 2025-12-01 10:33:03.238387194 +0000 UTC m=+0.073364451 container health_status 9ce59c2758d021c154e97f8a3178b73d8f43045c59b9ba6fd93369a760a49136 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec  1 05:33:03 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:33:03 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:33:03 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:33:03.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:33:03 np0005540826 nova_compute[229148]: 2025-12-01 10:33:03.734 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:33:03 np0005540826 nova_compute[229148]: 2025-12-01 10:33:03.734 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  1 05:33:03 np0005540826 nova_compute[229148]: 2025-12-01 10:33:03.734 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  1 05:33:03 np0005540826 nova_compute[229148]: 2025-12-01 10:33:03.756 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  1 05:33:03 np0005540826 nova_compute[229148]: 2025-12-01 10:33:03.757 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:33:04 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:33:04 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:33:04 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:33:04.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:33:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:33:04.565 141685 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:33:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:33:04.566 141685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:33:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:33:04.566 141685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:33:05 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:33:05 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:33:05 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:33:05.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:33:05 np0005540826 nova_compute[229148]: 2025-12-01 10:33:05.973 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:33:06 np0005540826 nova_compute[229148]: 2025-12-01 10:33:06.110 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:33:06 np0005540826 nova_compute[229148]: 2025-12-01 10:33:06.110 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  1 05:33:06 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:33:06 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:33:06 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:33:06.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:33:06 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:33:07 np0005540826 nova_compute[229148]: 2025-12-01 10:33:07.109 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:33:07 np0005540826 nova_compute[229148]: 2025-12-01 10:33:07.110 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:33:07 np0005540826 nova_compute[229148]: 2025-12-01 10:33:07.407 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:33:07 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:33:07 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:33:07 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:33:07.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:33:08 np0005540826 nova_compute[229148]: 2025-12-01 10:33:08.106 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:33:08 np0005540826 nova_compute[229148]: 2025-12-01 10:33:08.109 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:33:08 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:33:08 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:33:08 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:33:08.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:33:09 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:33:09 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:33:09 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:33:09.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:33:10 np0005540826 nova_compute[229148]: 2025-12-01 10:33:10.111 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:33:10 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:33:10 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:33:10 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:33:10.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:33:10 np0005540826 nova_compute[229148]: 2025-12-01 10:33:10.978 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:33:11 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:33:11 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:33:11 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:33:11.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:33:11 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:33:12 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:33:12 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:33:12 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:33:12 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:33:12 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:33:12.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:33:12 np0005540826 nova_compute[229148]: 2025-12-01 10:33:12.409 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:33:13 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:33:13 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:33:13 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:33:13.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:33:14 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:33:14 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:33:14 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:33:14.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:33:14 np0005540826 podman[253235]: 2025-12-01 10:33:14.99851151 +0000 UTC m=+0.070528270 container health_status 2f6a34a2eb1139886d5df897b4264e2aa17883bd121a8757ac91426f6d44356f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Dec  1 05:33:15 np0005540826 podman[253236]: 2025-12-01 10:33:15.023555555 +0000 UTC m=+0.090773055 container health_status 6b06bbb7dde72e1297fe084ecc5611549b12415c8a2a7d771b9728c6924dbf55 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec  1 05:33:15 np0005540826 nova_compute[229148]: 2025-12-01 10:33:15.110 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:33:15 np0005540826 nova_compute[229148]: 2025-12-01 10:33:15.126 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:33:15 np0005540826 nova_compute[229148]: 2025-12-01 10:33:15.127 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:33:15 np0005540826 nova_compute[229148]: 2025-12-01 10:33:15.127 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:33:15 np0005540826 nova_compute[229148]: 2025-12-01 10:33:15.127 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  1 05:33:15 np0005540826 nova_compute[229148]: 2025-12-01 10:33:15.127 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:33:15 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:33:15 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3859021103' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:33:15 np0005540826 nova_compute[229148]: 2025-12-01 10:33:15.613 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.485s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:33:15 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:33:15 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:33:15 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:33:15.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:33:15 np0005540826 nova_compute[229148]: 2025-12-01 10:33:15.824 229152 WARNING nova.virt.libvirt.driver [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  1 05:33:15 np0005540826 nova_compute[229148]: 2025-12-01 10:33:15.826 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4872MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  1 05:33:15 np0005540826 nova_compute[229148]: 2025-12-01 10:33:15.826 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:33:15 np0005540826 nova_compute[229148]: 2025-12-01 10:33:15.826 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:33:15 np0005540826 nova_compute[229148]: 2025-12-01 10:33:15.897 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  1 05:33:15 np0005540826 nova_compute[229148]: 2025-12-01 10:33:15.898 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  1 05:33:15 np0005540826 nova_compute[229148]: 2025-12-01 10:33:15.926 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:33:15 np0005540826 nova_compute[229148]: 2025-12-01 10:33:15.981 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:33:16 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:33:16 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:33:16 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:33:16.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:33:16 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:33:16 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3811083457' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:33:16 np0005540826 nova_compute[229148]: 2025-12-01 10:33:16.418 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.492s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:33:16 np0005540826 nova_compute[229148]: 2025-12-01 10:33:16.427 229152 DEBUG nova.compute.provider_tree [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Inventory has not changed in ProviderTree for provider: 19014d04-db84-4f3d-831b-084720e9168c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  1 05:33:16 np0005540826 nova_compute[229148]: 2025-12-01 10:33:16.444 229152 DEBUG nova.scheduler.client.report [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Inventory has not changed for provider 19014d04-db84-4f3d-831b-084720e9168c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  1 05:33:16 np0005540826 nova_compute[229148]: 2025-12-01 10:33:16.447 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  1 05:33:16 np0005540826 nova_compute[229148]: 2025-12-01 10:33:16.448 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.622s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:33:16 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:33:17 np0005540826 nova_compute[229148]: 2025-12-01 10:33:17.412 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:33:17 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:33:17 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:33:17 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:33:17.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:33:18 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:33:18 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:33:18 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:33:18.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:33:19 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:33:19 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:33:19 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:33:19.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:33:20 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:33:20 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:33:20 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:33:20.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:33:20 np0005540826 nova_compute[229148]: 2025-12-01 10:33:20.984 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:33:21 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:33:21 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:33:21 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:33:21.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:33:21 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:33:22 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:33:22 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:33:22 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:33:22.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:33:22 np0005540826 nova_compute[229148]: 2025-12-01 10:33:22.415 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:33:23 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:33:23 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:33:23 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:33:23.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:33:24 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:33:24 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:33:24 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:33:24.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:33:25 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:33:25 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:33:25 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:33:25.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:33:25 np0005540826 nova_compute[229148]: 2025-12-01 10:33:25.986 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:33:26 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:33:26 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:33:26 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:33:26.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:33:26 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:33:27 np0005540826 nova_compute[229148]: 2025-12-01 10:33:27.462 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:33:27 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:33:27 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:33:27 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:33:27.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:33:28 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:33:28 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:33:28 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:33:28.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:33:29 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:33:29 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:33:29 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:33:29.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:33:30 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:33:30 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:33:30 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:33:30.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:33:30 np0005540826 ceph-mon[80026]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #73. Immutable memtables: 0.
Dec  1 05:33:30 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:33:30.635735) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  1 05:33:30 np0005540826 ceph-mon[80026]: rocksdb: [db/flush_job.cc:856] [default] [JOB 43] Flushing memtable with next log file: 73
Dec  1 05:33:30 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764585210635759, "job": 43, "event": "flush_started", "num_memtables": 1, "num_entries": 1799, "num_deletes": 251, "total_data_size": 4857382, "memory_usage": 4930672, "flush_reason": "Manual Compaction"}
Dec  1 05:33:30 np0005540826 ceph-mon[80026]: rocksdb: [db/flush_job.cc:885] [default] [JOB 43] Level-0 flush table #74: started
Dec  1 05:33:30 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764585210650979, "cf_name": "default", "job": 43, "event": "table_file_creation", "file_number": 74, "file_size": 3141933, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 37528, "largest_seqno": 39322, "table_properties": {"data_size": 3134277, "index_size": 4599, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1989, "raw_key_size": 15807, "raw_average_key_size": 20, "raw_value_size": 3119073, "raw_average_value_size": 3993, "num_data_blocks": 200, "num_entries": 781, "num_filter_entries": 781, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764585046, "oldest_key_time": 1764585046, "file_creation_time": 1764585210, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d6e1f97e-eb58-41c1-b758-cb672eabd75e", "db_session_id": "KSHNHU57VR1LHZ6EZUM4", "orig_file_number": 74, "seqno_to_time_mapping": "N/A"}}
Dec  1 05:33:30 np0005540826 ceph-mon[80026]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 43] Flush lasted 15297 microseconds, and 5932 cpu microseconds.
Dec  1 05:33:30 np0005540826 ceph-mon[80026]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 05:33:30 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:33:30.651029) [db/flush_job.cc:967] [default] [JOB 43] Level-0 flush table #74: 3141933 bytes OK
Dec  1 05:33:30 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:33:30.651049) [db/memtable_list.cc:519] [default] Level-0 commit table #74 started
Dec  1 05:33:30 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:33:30.652292) [db/memtable_list.cc:722] [default] Level-0 commit table #74: memtable #1 done
Dec  1 05:33:30 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:33:30.652303) EVENT_LOG_v1 {"time_micros": 1764585210652300, "job": 43, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  1 05:33:30 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:33:30.652320) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  1 05:33:30 np0005540826 ceph-mon[80026]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 43] Try to delete WAL files size 4849146, prev total WAL file size 4849146, number of live WAL files 2.
Dec  1 05:33:30 np0005540826 ceph-mon[80026]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000070.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:33:30 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:33:30.653399) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730033303132' seq:72057594037927935, type:22 .. '7061786F730033323634' seq:0, type:0; will stop at (end)
Dec  1 05:33:30 np0005540826 ceph-mon[80026]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 44] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  1 05:33:30 np0005540826 ceph-mon[80026]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 43 Base level 0, inputs: [74(3068KB)], [72(11MB)]
Dec  1 05:33:30 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764585210653448, "job": 44, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [74], "files_L6": [72], "score": -1, "input_data_size": 15456180, "oldest_snapshot_seqno": -1}
Dec  1 05:33:30 np0005540826 ceph-mon[80026]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 44] Generated table #75: 6759 keys, 13343102 bytes, temperature: kUnknown
Dec  1 05:33:30 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764585210712298, "cf_name": "default", "job": 44, "event": "table_file_creation", "file_number": 75, "file_size": 13343102, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13300148, "index_size": 24922, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 16965, "raw_key_size": 177462, "raw_average_key_size": 26, "raw_value_size": 13180605, "raw_average_value_size": 1950, "num_data_blocks": 980, "num_entries": 6759, "num_filter_entries": 6759, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582559, "oldest_key_time": 0, "file_creation_time": 1764585210, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d6e1f97e-eb58-41c1-b758-cb672eabd75e", "db_session_id": "KSHNHU57VR1LHZ6EZUM4", "orig_file_number": 75, "seqno_to_time_mapping": "N/A"}}
Dec  1 05:33:30 np0005540826 ceph-mon[80026]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 05:33:30 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:33:30.712497) [db/compaction/compaction_job.cc:1663] [default] [JOB 44] Compacted 1@0 + 1@6 files to L6 => 13343102 bytes
Dec  1 05:33:30 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:33:30.713740) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 262.4 rd, 226.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.0, 11.7 +0.0 blob) out(12.7 +0.0 blob), read-write-amplify(9.2) write-amplify(4.2) OK, records in: 7275, records dropped: 516 output_compression: NoCompression
Dec  1 05:33:30 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:33:30.713755) EVENT_LOG_v1 {"time_micros": 1764585210713748, "job": 44, "event": "compaction_finished", "compaction_time_micros": 58910, "compaction_time_cpu_micros": 25972, "output_level": 6, "num_output_files": 1, "total_output_size": 13343102, "num_input_records": 7275, "num_output_records": 6759, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  1 05:33:30 np0005540826 ceph-mon[80026]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000074.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:33:30 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764585210714357, "job": 44, "event": "table_file_deletion", "file_number": 74}
Dec  1 05:33:30 np0005540826 ceph-mon[80026]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000072.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:33:30 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764585210716398, "job": 44, "event": "table_file_deletion", "file_number": 72}
Dec  1 05:33:30 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:33:30.653330) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:33:30 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:33:30.716467) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:33:30 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:33:30.716472) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:33:30 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:33:30.716473) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:33:30 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:33:30.716475) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:33:30 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:33:30.716476) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:33:30 np0005540826 nova_compute[229148]: 2025-12-01 10:33:30.987 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:33:31 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:33:31 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:33:31 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:33:31.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:33:31 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:33:32 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:33:32 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:33:32 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:33:32.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:33:32 np0005540826 nova_compute[229148]: 2025-12-01 10:33:32.464 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:33:33 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:33:33 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:33:33 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:33:33.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:33:34 np0005540826 podman[253360]: 2025-12-01 10:33:34.000799366 +0000 UTC m=+0.074759276 container health_status 9ce59c2758d021c154e97f8a3178b73d8f43045c59b9ba6fd93369a760a49136 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, io.buildah.version=1.41.3)
Dec  1 05:33:34 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:33:34 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:33:34 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:33:34.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:33:35 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:33:35 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:33:35 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:33:35.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:33:35 np0005540826 nova_compute[229148]: 2025-12-01 10:33:35.990 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:33:36 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:33:36 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:33:36 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:33:36.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:33:36 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:33:37 np0005540826 nova_compute[229148]: 2025-12-01 10:33:37.467 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:33:37 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:33:37 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:33:37 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:33:37.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:33:38 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:33:38 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:33:38 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:33:38.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:33:39 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:33:39 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:33:39 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:33:39.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:33:40 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:33:40 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:33:40 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:33:40.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:33:40 np0005540826 nova_compute[229148]: 2025-12-01 10:33:40.992 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:33:41 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:33:41 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:33:41 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:33:41.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:33:41 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:33:42 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:33:42 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:33:42 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:33:42.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:33:42 np0005540826 nova_compute[229148]: 2025-12-01 10:33:42.470 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:33:43 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:33:43 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:33:43 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:33:43.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:33:44 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:33:44 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:33:44 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:33:44.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:33:45 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:33:45 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:33:45 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:33:45.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:33:45 np0005540826 nova_compute[229148]: 2025-12-01 10:33:45.993 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:33:46 np0005540826 podman[253386]: 2025-12-01 10:33:46.0230029 +0000 UTC m=+0.093946444 container health_status 2f6a34a2eb1139886d5df897b4264e2aa17883bd121a8757ac91426f6d44356f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Dec  1 05:33:46 np0005540826 podman[253387]: 2025-12-01 10:33:46.030971909 +0000 UTC m=+0.103339849 container health_status 6b06bbb7dde72e1297fe084ecc5611549b12415c8a2a7d771b9728c6924dbf55 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec  1 05:33:46 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:33:46 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:33:46 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:33:46.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:33:46 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:33:47 np0005540826 nova_compute[229148]: 2025-12-01 10:33:47.474 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:33:47 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:33:47 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:33:47 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:33:47.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:33:48 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:33:48 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:33:48 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:33:48.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:33:49 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:33:49 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:33:49 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:33:49.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:33:50 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:33:50 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:33:50 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:33:50.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:33:50 np0005540826 nova_compute[229148]: 2025-12-01 10:33:50.997 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:33:51 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:33:51 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:33:51 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:33:51.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:33:51 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:33:52 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:33:52 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:33:52 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:33:52.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:33:52 np0005540826 nova_compute[229148]: 2025-12-01 10:33:52.475 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:33:53 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:33:53 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:33:53 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:33:53.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:33:54 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:33:54 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:33:54 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:33:54.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:33:55 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:33:55 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:33:55 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:33:55.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:33:55 np0005540826 nova_compute[229148]: 2025-12-01 10:33:55.997 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:33:56 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:33:56 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:33:56 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:33:56.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:33:56 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:33:57 np0005540826 nova_compute[229148]: 2025-12-01 10:33:57.478 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:33:57 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:33:57 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:33:57 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:33:57.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:33:58 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:33:58 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:33:58 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:33:58.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:33:59 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:33:59 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:33:59 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:33:59.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:34:00 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:34:00 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:34:00 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:34:00.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:34:01 np0005540826 nova_compute[229148]: 2025-12-01 10:34:01.000 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:34:01 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:34:01 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:34:01 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:34:01.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:34:01 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:34:02 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:34:02 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:34:02 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:34:02.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:34:02 np0005540826 nova_compute[229148]: 2025-12-01 10:34:02.479 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:34:03 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:34:03 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:34:03 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:34:03.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:34:04 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:34:04 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:34:04 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:34:04.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:34:04 np0005540826 nova_compute[229148]: 2025-12-01 10:34:04.449 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:34:04 np0005540826 nova_compute[229148]: 2025-12-01 10:34:04.449 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  1 05:34:04 np0005540826 nova_compute[229148]: 2025-12-01 10:34:04.449 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  1 05:34:04 np0005540826 nova_compute[229148]: 2025-12-01 10:34:04.460 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  1 05:34:04 np0005540826 nova_compute[229148]: 2025-12-01 10:34:04.461 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:34:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:34:04.566 141685 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:34:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:34:04.567 141685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:34:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:34:04.567 141685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:34:04 np0005540826 podman[253465]: 2025-12-01 10:34:04.981441158 +0000 UTC m=+0.068497889 container health_status 9ce59c2758d021c154e97f8a3178b73d8f43045c59b9ba6fd93369a760a49136 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3)
Dec  1 05:34:05 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:34:05 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:34:05 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:34:05.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:34:06 np0005540826 nova_compute[229148]: 2025-12-01 10:34:06.002 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:34:06 np0005540826 nova_compute[229148]: 2025-12-01 10:34:06.110 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:34:06 np0005540826 nova_compute[229148]: 2025-12-01 10:34:06.110 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  1 05:34:06 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:34:06 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:34:06 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:34:06.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:34:06 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:34:07 np0005540826 nova_compute[229148]: 2025-12-01 10:34:07.105 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:34:07 np0005540826 nova_compute[229148]: 2025-12-01 10:34:07.122 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:34:07 np0005540826 nova_compute[229148]: 2025-12-01 10:34:07.482 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:34:07 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:34:07 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:34:07 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:34:07.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:34:08 np0005540826 nova_compute[229148]: 2025-12-01 10:34:08.109 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:34:08 np0005540826 nova_compute[229148]: 2025-12-01 10:34:08.109 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:34:08 np0005540826 nova_compute[229148]: 2025-12-01 10:34:08.109 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:34:08 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:34:08 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:34:08 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:34:08.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:34:09 np0005540826 podman[253612]: 2025-12-01 10:34:09.243866313 +0000 UTC m=+0.064542001 container exec 7c618b1a07db29948a07bf15abaac0e58a16d635b553771e228aad7ce3c0be89 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-crash-compute-1, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 05:34:09 np0005540826 podman[253612]: 2025-12-01 10:34:09.347525428 +0000 UTC m=+0.168201076 container exec_died 7c618b1a07db29948a07bf15abaac0e58a16d635b553771e228aad7ce3c0be89 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-crash-compute-1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid)
Dec  1 05:34:09 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Dec  1 05:34:09 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:34:09 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:34:09 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:34:09.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:34:09 np0005540826 podman[253732]: 2025-12-01 10:34:09.755462232 +0000 UTC m=+0.046501030 container exec b5cba524f9aa4c6a3dd34b5465b21a05089c422db1ad8a4bb4113778960afe0c (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec  1 05:34:09 np0005540826 podman[253732]: 2025-12-01 10:34:09.768375454 +0000 UTC m=+0.059414222 container exec_died b5cba524f9aa4c6a3dd34b5465b21a05089c422db1ad8a4bb4113778960afe0c (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec  1 05:34:10 np0005540826 podman[253893]: 2025-12-01 10:34:10.294195098 +0000 UTC m=+0.054222803 container exec 5d28e05f02e2faaaecccbb224265ce968eb74994512ada8525a0c42a70019a9e (image=quay.io/ceph/haproxy:2.3, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis)
Dec  1 05:34:10 np0005540826 podman[253893]: 2025-12-01 10:34:10.311416918 +0000 UTC m=+0.071444613 container exec_died 5d28e05f02e2faaaecccbb224265ce968eb74994512ada8525a0c42a70019a9e (image=quay.io/ceph/haproxy:2.3, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-1-pwynis)
Dec  1 05:34:10 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:34:10 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:34:10 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:34:10.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:34:10 np0005540826 podman[253959]: 2025-12-01 10:34:10.49913373 +0000 UTC m=+0.046268525 container exec b99b1792fbad48bc30fedd676b925d405af73aba1041bee68b375eadfb2319cf (image=quay.io/ceph/keepalived:2.2.4, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-1-wzwqmm, com.redhat.component=keepalived-container, vcs-type=git, description=keepalived for Ceph, io.openshift.expose-services=, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vendor=Red Hat, Inc., architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.buildah.version=1.28.2, name=keepalived, summary=Provides keepalived on RHEL 9 for Ceph., io.openshift.tags=Ceph keepalived, release=1793, version=2.2.4, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2023-02-22T09:23:20, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Keepalived on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec  1 05:34:10 np0005540826 podman[253959]: 2025-12-01 10:34:10.515312543 +0000 UTC m=+0.062447318 container exec_died b99b1792fbad48bc30fedd676b925d405af73aba1041bee68b375eadfb2319cf (image=quay.io/ceph/keepalived:2.2.4, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-1-wzwqmm, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=keepalived-container, io.openshift.tags=Ceph keepalived, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1793, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=2.2.4, io.openshift.expose-services=, io.buildah.version=1.28.2, io.k8s.display-name=Keepalived on RHEL 9, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, build-date=2023-02-22T09:23:20, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-type=git, description=keepalived for Ceph, name=keepalived, distribution-scope=public, summary=Provides keepalived on RHEL 9 for Ceph.)
Dec  1 05:34:11 np0005540826 nova_compute[229148]: 2025-12-01 10:34:11.003 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:34:11 np0005540826 nova_compute[229148]: 2025-12-01 10:34:11.110 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:34:11 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:34:11 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:34:11 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:34:11 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:34:11 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Dec  1 05:34:11 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Dec  1 05:34:11 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 05:34:11 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:34:11 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:34:11 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 05:34:11 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:34:11 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:34:11 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:34:11.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:34:11 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:34:12 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:34:12 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:34:12 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:34:12.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:34:12 np0005540826 nova_compute[229148]: 2025-12-01 10:34:12.519 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:34:13 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:34:13 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:34:13 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:34:13.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:34:14 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:34:14 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:34:14 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:34:14.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:34:15 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:34:15 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:34:15 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:34:15.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:34:16 np0005540826 nova_compute[229148]: 2025-12-01 10:34:16.006 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:34:16 np0005540826 nova_compute[229148]: 2025-12-01 10:34:16.110 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:34:16 np0005540826 nova_compute[229148]: 2025-12-01 10:34:16.139 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:34:16 np0005540826 nova_compute[229148]: 2025-12-01 10:34:16.139 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:34:16 np0005540826 nova_compute[229148]: 2025-12-01 10:34:16.140 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:34:16 np0005540826 nova_compute[229148]: 2025-12-01 10:34:16.140 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  1 05:34:16 np0005540826 nova_compute[229148]: 2025-12-01 10:34:16.141 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:34:16 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:34:16 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:34:16 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:34:16.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:34:16 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:34:16 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/938820539' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:34:16 np0005540826 nova_compute[229148]: 2025-12-01 10:34:16.597 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:34:16 np0005540826 podman[254120]: 2025-12-01 10:34:16.660065661 +0000 UTC m=+0.066452438 container health_status 2f6a34a2eb1139886d5df897b4264e2aa17883bd121a8757ac91426f6d44356f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec  1 05:34:16 np0005540826 podman[254123]: 2025-12-01 10:34:16.704921109 +0000 UTC m=+0.111376748 container health_status 6b06bbb7dde72e1297fe084ecc5611549b12415c8a2a7d771b9728c6924dbf55 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, config_id=ovn_controller)
Dec  1 05:34:16 np0005540826 nova_compute[229148]: 2025-12-01 10:34:16.782 229152 WARNING nova.virt.libvirt.driver [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  1 05:34:16 np0005540826 nova_compute[229148]: 2025-12-01 10:34:16.783 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4845MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  1 05:34:16 np0005540826 nova_compute[229148]: 2025-12-01 10:34:16.783 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:34:16 np0005540826 nova_compute[229148]: 2025-12-01 10:34:16.784 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:34:16 np0005540826 nova_compute[229148]: 2025-12-01 10:34:16.849 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  1 05:34:16 np0005540826 nova_compute[229148]: 2025-12-01 10:34:16.849 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  1 05:34:16 np0005540826 nova_compute[229148]: 2025-12-01 10:34:16.863 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:34:16 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:34:17 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:34:17 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/784333000' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:34:17 np0005540826 nova_compute[229148]: 2025-12-01 10:34:17.305 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:34:17 np0005540826 nova_compute[229148]: 2025-12-01 10:34:17.311 229152 DEBUG nova.compute.provider_tree [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Inventory has not changed in ProviderTree for provider: 19014d04-db84-4f3d-831b-084720e9168c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  1 05:34:17 np0005540826 nova_compute[229148]: 2025-12-01 10:34:17.325 229152 DEBUG nova.scheduler.client.report [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Inventory has not changed for provider 19014d04-db84-4f3d-831b-084720e9168c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  1 05:34:17 np0005540826 nova_compute[229148]: 2025-12-01 10:34:17.327 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  1 05:34:17 np0005540826 nova_compute[229148]: 2025-12-01 10:34:17.327 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.543s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:34:17 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:34:17 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:34:17 np0005540826 nova_compute[229148]: 2025-12-01 10:34:17.520 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:34:17 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:34:17 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:34:17 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:34:17.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:34:18 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:34:18 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:34:18 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:34:18.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:34:19 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:34:19 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:34:19 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:34:19.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:34:20 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:34:20 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:34:20 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:34:20.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:34:21 np0005540826 nova_compute[229148]: 2025-12-01 10:34:21.010 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:34:21 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:34:21 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:34:21 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:34:21.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:34:21 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:34:22 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:34:22 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:34:22 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:34:22.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:34:22 np0005540826 nova_compute[229148]: 2025-12-01 10:34:22.522 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:34:23 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:34:23 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:34:23 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:34:23.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:34:24 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:34:24 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:34:24 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:34:24.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:34:25 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:34:25 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:34:25 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:34:25.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:34:26 np0005540826 nova_compute[229148]: 2025-12-01 10:34:26.012 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:34:26 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:34:26 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:34:26 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:34:26.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:34:26 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:34:27 np0005540826 nova_compute[229148]: 2025-12-01 10:34:27.571 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:34:27 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:34:27 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:34:27 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:34:27.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:34:28 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:34:28 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:34:28 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:34:28.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:34:29 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:34:29 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:34:29 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:34:29.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:34:30 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:34:30 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:34:30 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:34:30.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:34:31 np0005540826 nova_compute[229148]: 2025-12-01 10:34:31.015 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:34:31 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:34:31 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:34:31 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:34:31.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:34:31 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:34:32 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:34:32 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:34:32 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:34:32.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:34:32 np0005540826 nova_compute[229148]: 2025-12-01 10:34:32.575 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:34:33 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:34:33 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:34:33 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:34:33.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:34:34 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:34:34 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:34:34 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:34:34.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:34:35 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:34:35 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:34:35 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:34:35.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:34:35 np0005540826 podman[254224]: 2025-12-01 10:34:35.988488121 +0000 UTC m=+0.077116019 container health_status 9ce59c2758d021c154e97f8a3178b73d8f43045c59b9ba6fd93369a760a49136 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Dec  1 05:34:36 np0005540826 nova_compute[229148]: 2025-12-01 10:34:36.016 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:34:36 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:34:36 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:34:36 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:34:36.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:34:36 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:34:37 np0005540826 nova_compute[229148]: 2025-12-01 10:34:37.577 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:34:37 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:34:37 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:34:37 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:34:37.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:34:38 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:34:38 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:34:38 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:34:38.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:34:39 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:34:39 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:34:39 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:34:39.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:34:40 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:34:40 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:34:40 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:34:40.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:34:41 np0005540826 nova_compute[229148]: 2025-12-01 10:34:41.019 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:34:41 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:34:41 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:34:41 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:34:41.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:34:41 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:34:42 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:34:42 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:34:42 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:34:42.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:34:42 np0005540826 nova_compute[229148]: 2025-12-01 10:34:42.580 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:34:43 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:34:43 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:34:43 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:34:43.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:34:44 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:34:44 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:34:44 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:34:44.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:34:45 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:34:45 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:34:45 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:34:45.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:34:46 np0005540826 nova_compute[229148]: 2025-12-01 10:34:46.022 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:34:46 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:34:46 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:34:46 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:34:46.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:34:46 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:34:46 np0005540826 podman[254252]: 2025-12-01 10:34:46.976969629 +0000 UTC m=+0.055599318 container health_status 2f6a34a2eb1139886d5df897b4264e2aa17883bd121a8757ac91426f6d44356f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  1 05:34:47 np0005540826 podman[254253]: 2025-12-01 10:34:47.01163408 +0000 UTC m=+0.085578382 container health_status 6b06bbb7dde72e1297fe084ecc5611549b12415c8a2a7d771b9728c6924dbf55 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Dec  1 05:34:47 np0005540826 nova_compute[229148]: 2025-12-01 10:34:47.583 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:34:47 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:34:47 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:34:47 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:34:47.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:34:48 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:34:48 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:34:48 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:34:48.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:34:49 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:34:49 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:34:49 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:34:49.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:34:50 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:34:50 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:34:50 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:34:50.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:34:51 np0005540826 nova_compute[229148]: 2025-12-01 10:34:51.025 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:34:51 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:34:51 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:34:51 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:34:51.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:34:51 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:34:52 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:34:52 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:34:52 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:34:52.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:34:52 np0005540826 nova_compute[229148]: 2025-12-01 10:34:52.585 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:34:53 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:34:53 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:34:53 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:34:53.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:34:54 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:34:54 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:34:54 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:34:54.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:34:55 np0005540826 nova_compute[229148]: 2025-12-01 10:34:55.110 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:34:55 np0005540826 nova_compute[229148]: 2025-12-01 10:34:55.110 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Dec  1 05:34:55 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:34:55 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:34:55 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:34:55.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:34:55 np0005540826 ceph-mon[80026]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #76. Immutable memtables: 0.
Dec  1 05:34:55 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:34:55.897528) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  1 05:34:55 np0005540826 ceph-mon[80026]: rocksdb: [db/flush_job.cc:856] [default] [JOB 45] Flushing memtable with next log file: 76
Dec  1 05:34:55 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764585295897588, "job": 45, "event": "flush_started", "num_memtables": 1, "num_entries": 1096, "num_deletes": 255, "total_data_size": 2624081, "memory_usage": 2676544, "flush_reason": "Manual Compaction"}
Dec  1 05:34:55 np0005540826 ceph-mon[80026]: rocksdb: [db/flush_job.cc:885] [default] [JOB 45] Level-0 flush table #77: started
Dec  1 05:34:55 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764585295908578, "cf_name": "default", "job": 45, "event": "table_file_creation", "file_number": 77, "file_size": 1718985, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 39328, "largest_seqno": 40418, "table_properties": {"data_size": 1713999, "index_size": 2510, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1413, "raw_key_size": 10753, "raw_average_key_size": 19, "raw_value_size": 1703981, "raw_average_value_size": 3109, "num_data_blocks": 108, "num_entries": 548, "num_filter_entries": 548, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764585211, "oldest_key_time": 1764585211, "file_creation_time": 1764585295, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d6e1f97e-eb58-41c1-b758-cb672eabd75e", "db_session_id": "KSHNHU57VR1LHZ6EZUM4", "orig_file_number": 77, "seqno_to_time_mapping": "N/A"}}
Dec  1 05:34:55 np0005540826 ceph-mon[80026]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 45] Flush lasted 11100 microseconds, and 5356 cpu microseconds.
Dec  1 05:34:55 np0005540826 ceph-mon[80026]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 05:34:55 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:34:55.908636) [db/flush_job.cc:967] [default] [JOB 45] Level-0 flush table #77: 1718985 bytes OK
Dec  1 05:34:55 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:34:55.908653) [db/memtable_list.cc:519] [default] Level-0 commit table #77 started
Dec  1 05:34:55 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:34:55.910218) [db/memtable_list.cc:722] [default] Level-0 commit table #77: memtable #1 done
Dec  1 05:34:55 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:34:55.910237) EVENT_LOG_v1 {"time_micros": 1764585295910231, "job": 45, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  1 05:34:55 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:34:55.910258) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  1 05:34:55 np0005540826 ceph-mon[80026]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 45] Try to delete WAL files size 2618686, prev total WAL file size 2618686, number of live WAL files 2.
Dec  1 05:34:55 np0005540826 ceph-mon[80026]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000073.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:34:55 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:34:55.911228) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0031303034' seq:72057594037927935, type:22 .. '6C6F676D0031323535' seq:0, type:0; will stop at (end)
Dec  1 05:34:55 np0005540826 ceph-mon[80026]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 46] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  1 05:34:55 np0005540826 ceph-mon[80026]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 45 Base level 0, inputs: [77(1678KB)], [75(12MB)]
Dec  1 05:34:55 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764585295911274, "job": 46, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [77], "files_L6": [75], "score": -1, "input_data_size": 15062087, "oldest_snapshot_seqno": -1}
Dec  1 05:34:55 np0005540826 ceph-mon[80026]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 46] Generated table #78: 6779 keys, 14900420 bytes, temperature: kUnknown
Dec  1 05:34:55 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764585295985264, "cf_name": "default", "job": 46, "event": "table_file_creation", "file_number": 78, "file_size": 14900420, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14855584, "index_size": 26804, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 16965, "raw_key_size": 178804, "raw_average_key_size": 26, "raw_value_size": 14733858, "raw_average_value_size": 2173, "num_data_blocks": 1057, "num_entries": 6779, "num_filter_entries": 6779, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582559, "oldest_key_time": 0, "file_creation_time": 1764585295, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d6e1f97e-eb58-41c1-b758-cb672eabd75e", "db_session_id": "KSHNHU57VR1LHZ6EZUM4", "orig_file_number": 78, "seqno_to_time_mapping": "N/A"}}
Dec  1 05:34:55 np0005540826 ceph-mon[80026]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 05:34:55 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:34:55.985629) [db/compaction/compaction_job.cc:1663] [default] [JOB 46] Compacted 1@0 + 1@6 files to L6 => 14900420 bytes
Dec  1 05:34:55 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:34:55.987510) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 203.2 rd, 201.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.6, 12.7 +0.0 blob) out(14.2 +0.0 blob), read-write-amplify(17.4) write-amplify(8.7) OK, records in: 7307, records dropped: 528 output_compression: NoCompression
Dec  1 05:34:55 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:34:55.987546) EVENT_LOG_v1 {"time_micros": 1764585295987530, "job": 46, "event": "compaction_finished", "compaction_time_micros": 74115, "compaction_time_cpu_micros": 39045, "output_level": 6, "num_output_files": 1, "total_output_size": 14900420, "num_input_records": 7307, "num_output_records": 6779, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  1 05:34:55 np0005540826 ceph-mon[80026]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000077.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:34:55 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764585295988535, "job": 46, "event": "table_file_deletion", "file_number": 77}
Dec  1 05:34:55 np0005540826 ceph-mon[80026]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000075.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:34:55 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764585295993495, "job": 46, "event": "table_file_deletion", "file_number": 75}
Dec  1 05:34:55 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:34:55.911146) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:34:55 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:34:55.993799) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:34:55 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:34:55.993812) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:34:55 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:34:55.993824) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:34:55 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:34:55.994142) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:34:55 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:34:55.994148) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:34:56 np0005540826 nova_compute[229148]: 2025-12-01 10:34:56.027 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:34:56 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:34:56 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:34:56 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:34:56.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:34:56 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:34:57 np0005540826 nova_compute[229148]: 2025-12-01 10:34:57.587 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:34:57 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:34:57 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:34:57 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:34:57.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:34:58 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:34:58 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:34:58 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:34:58.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:34:59 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:34:59 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:34:59 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:34:59.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:35:00 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:35:00 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:35:00 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:35:00.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:35:01 np0005540826 nova_compute[229148]: 2025-12-01 10:35:01.043 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:35:01 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:35:01 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:35:01 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:35:01.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:35:01 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:35:02 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:35:02 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:35:02 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:35:02.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:35:02 np0005540826 nova_compute[229148]: 2025-12-01 10:35:02.590 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:35:03 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:35:03 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:35:03 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:35:03.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:35:04 np0005540826 nova_compute[229148]: 2025-12-01 10:35:04.129 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:35:04 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:35:04 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:35:04 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:35:04.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:35:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:35:04.567 141685 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:35:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:35:04.567 141685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:35:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:35:04.567 141685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:35:05 np0005540826 nova_compute[229148]: 2025-12-01 10:35:05.110 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:35:05 np0005540826 nova_compute[229148]: 2025-12-01 10:35:05.110 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  1 05:35:05 np0005540826 nova_compute[229148]: 2025-12-01 10:35:05.111 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  1 05:35:05 np0005540826 nova_compute[229148]: 2025-12-01 10:35:05.125 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  1 05:35:05 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:35:05 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:35:05 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:35:05.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:35:06 np0005540826 nova_compute[229148]: 2025-12-01 10:35:06.045 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:35:06 np0005540826 nova_compute[229148]: 2025-12-01 10:35:06.109 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:35:06 np0005540826 nova_compute[229148]: 2025-12-01 10:35:06.110 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  1 05:35:06 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:35:06 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:35:06 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:35:06.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:35:06 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:35:07 np0005540826 podman[254331]: 2025-12-01 10:35:07.016817193 +0000 UTC m=+0.094304251 container health_status 9ce59c2758d021c154e97f8a3178b73d8f43045c59b9ba6fd93369a760a49136 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 05:35:07 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec  1 05:35:07 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/530935708' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec  1 05:35:07 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec  1 05:35:07 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/530935708' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec  1 05:35:07 np0005540826 nova_compute[229148]: 2025-12-01 10:35:07.593 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:35:07 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:35:07 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:35:07 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:35:07.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:35:08 np0005540826 nova_compute[229148]: 2025-12-01 10:35:08.106 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:35:08 np0005540826 nova_compute[229148]: 2025-12-01 10:35:08.110 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:35:08 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:35:08 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:35:08 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:35:08.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:35:09 np0005540826 nova_compute[229148]: 2025-12-01 10:35:09.110 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:35:09 np0005540826 nova_compute[229148]: 2025-12-01 10:35:09.111 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:35:09 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:35:09 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:35:09 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:35:09.793 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:35:10 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:35:10 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:35:10 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:35:10.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:35:11 np0005540826 nova_compute[229148]: 2025-12-01 10:35:11.046 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:35:11 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:35:11 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:35:11 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:35:11.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:35:11 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:35:12 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:35:12 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:35:12 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:35:12.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:35:12 np0005540826 nova_compute[229148]: 2025-12-01 10:35:12.596 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:35:13 np0005540826 nova_compute[229148]: 2025-12-01 10:35:13.111 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:35:13 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:35:13 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:35:13 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:35:13.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:35:14 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:35:14 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:35:14 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:35:14.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:35:15 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:35:15 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:35:15 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:35:15.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:35:16 np0005540826 nova_compute[229148]: 2025-12-01 10:35:16.048 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:35:16 np0005540826 nova_compute[229148]: 2025-12-01 10:35:16.110 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:35:16 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:35:16 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:35:16 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:35:16.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:35:16 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:35:17 np0005540826 nova_compute[229148]: 2025-12-01 10:35:17.599 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:35:17 np0005540826 nova_compute[229148]: 2025-12-01 10:35:17.704 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:35:17 np0005540826 nova_compute[229148]: 2025-12-01 10:35:17.705 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:35:17 np0005540826 nova_compute[229148]: 2025-12-01 10:35:17.705 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:35:17 np0005540826 nova_compute[229148]: 2025-12-01 10:35:17.705 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  1 05:35:17 np0005540826 nova_compute[229148]: 2025-12-01 10:35:17.706 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:35:17 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:35:17 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:35:17 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:35:17.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:35:17 np0005540826 podman[254480]: 2025-12-01 10:35:17.976932796 +0000 UTC m=+0.056730017 container health_status 2f6a34a2eb1139886d5df897b4264e2aa17883bd121a8757ac91426f6d44356f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125)
Dec  1 05:35:18 np0005540826 podman[254481]: 2025-12-01 10:35:18.029049325 +0000 UTC m=+0.104976869 container health_status 6b06bbb7dde72e1297fe084ecc5611549b12415c8a2a7d771b9728c6924dbf55 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec  1 05:35:18 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:35:18 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3129889998' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:35:18 np0005540826 nova_compute[229148]: 2025-12-01 10:35:18.166 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:35:18 np0005540826 nova_compute[229148]: 2025-12-01 10:35:18.315 229152 WARNING nova.virt.libvirt.driver [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  1 05:35:18 np0005540826 nova_compute[229148]: 2025-12-01 10:35:18.317 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4876MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  1 05:35:18 np0005540826 nova_compute[229148]: 2025-12-01 10:35:18.317 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:35:18 np0005540826 nova_compute[229148]: 2025-12-01 10:35:18.318 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:35:18 np0005540826 nova_compute[229148]: 2025-12-01 10:35:18.398 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  1 05:35:18 np0005540826 nova_compute[229148]: 2025-12-01 10:35:18.398 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  1 05:35:18 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:35:18 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:35:18 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:35:18.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:35:18 np0005540826 nova_compute[229148]: 2025-12-01 10:35:18.455 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:35:18 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:35:18 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1569583609' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:35:18 np0005540826 nova_compute[229148]: 2025-12-01 10:35:18.913 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:35:18 np0005540826 nova_compute[229148]: 2025-12-01 10:35:18.919 229152 DEBUG nova.compute.provider_tree [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Inventory has not changed in ProviderTree for provider: 19014d04-db84-4f3d-831b-084720e9168c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  1 05:35:19 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 05:35:19 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:35:19 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:35:19 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 05:35:19 np0005540826 nova_compute[229148]: 2025-12-01 10:35:19.018 229152 DEBUG nova.scheduler.client.report [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Inventory has not changed for provider 19014d04-db84-4f3d-831b-084720e9168c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  1 05:35:19 np0005540826 nova_compute[229148]: 2025-12-01 10:35:19.020 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  1 05:35:19 np0005540826 nova_compute[229148]: 2025-12-01 10:35:19.020 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.703s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:35:19 np0005540826 nova_compute[229148]: 2025-12-01 10:35:19.021 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:35:19 np0005540826 nova_compute[229148]: 2025-12-01 10:35:19.022 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Dec  1 05:35:19 np0005540826 nova_compute[229148]: 2025-12-01 10:35:19.039 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Dec  1 05:35:19 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:35:19 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:35:19 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:35:19.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:35:20 np0005540826 nova_compute[229148]: 2025-12-01 10:35:20.110 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:35:20 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:35:20 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:35:20 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:35:20.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:35:21 np0005540826 nova_compute[229148]: 2025-12-01 10:35:21.049 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:35:21 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:35:21 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:35:21 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:35:21.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:35:21 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:35:22 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:35:22 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:35:22 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:35:22.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:35:22 np0005540826 nova_compute[229148]: 2025-12-01 10:35:22.601 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:35:23 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:35:23 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:35:23 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:35:23.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:35:24 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:35:24 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:35:24 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:35:24.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:35:25 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:35:25 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:35:25 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:35:25.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:35:26 np0005540826 nova_compute[229148]: 2025-12-01 10:35:26.052 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:35:26 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:35:26 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:35:26 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:35:26 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:35:26 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:35:26.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:35:26 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:35:27 np0005540826 nova_compute[229148]: 2025-12-01 10:35:27.603 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:35:27 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:35:27 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:35:27 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:35:27.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:35:28 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:35:28 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:35:28 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:35:28.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:35:29 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:35:29 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:35:29 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:35:29.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:35:30 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:35:30 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:35:30 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:35:30.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:35:31 np0005540826 nova_compute[229148]: 2025-12-01 10:35:31.053 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:35:31 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:35:31 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:35:31 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:35:31.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:35:31 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:35:32 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:35:32 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:35:32 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:35:32.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:35:32 np0005540826 nova_compute[229148]: 2025-12-01 10:35:32.605 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:35:33 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:35:33 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:35:33 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:35:33.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:35:34 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:35:34 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:35:34 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:35:34.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:35:35 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:35:35 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:35:35 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:35:35.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:35:36 np0005540826 nova_compute[229148]: 2025-12-01 10:35:36.056 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:35:36 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:35:36 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:35:36 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:35:36.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:35:36 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:35:37 np0005540826 nova_compute[229148]: 2025-12-01 10:35:37.607 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:35:37 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:35:37 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:35:37 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:35:37.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:35:37 np0005540826 podman[254607]: 2025-12-01 10:35:37.964594244 +0000 UTC m=+0.050744706 container health_status 9ce59c2758d021c154e97f8a3178b73d8f43045c59b9ba6fd93369a760a49136 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec  1 05:35:38 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:35:38 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:35:38 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:35:38.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:35:39 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:35:39 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:35:39 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:35:39.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:35:40 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:35:40 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:35:40 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:35:40.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:35:41 np0005540826 nova_compute[229148]: 2025-12-01 10:35:41.120 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:35:41 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:35:41 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:35:41 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:35:41.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:35:41 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:35:42 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:35:42 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:35:42 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:35:42.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:35:42 np0005540826 nova_compute[229148]: 2025-12-01 10:35:42.609 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:35:43 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:35:43 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:35:43 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:35:43.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:35:44 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:35:44 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:35:44 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:35:44.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:35:45 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:35:45 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:35:45 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:35:45.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:35:46 np0005540826 nova_compute[229148]: 2025-12-01 10:35:46.121 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:35:46 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:35:46 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:35:46 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:35:46.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:35:46 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:35:47 np0005540826 nova_compute[229148]: 2025-12-01 10:35:47.611 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:35:47 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:35:47 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:35:47 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:35:47.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:35:48 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:35:48 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:35:48 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:35:48.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:35:48 np0005540826 podman[254631]: 2025-12-01 10:35:48.989594362 +0000 UTC m=+0.068629536 container health_status 2f6a34a2eb1139886d5df897b4264e2aa17883bd121a8757ac91426f6d44356f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Dec  1 05:35:48 np0005540826 podman[254632]: 2025-12-01 10:35:48.998728181 +0000 UTC m=+0.083305124 container health_status 6b06bbb7dde72e1297fe084ecc5611549b12415c8a2a7d771b9728c6924dbf55 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec  1 05:35:49 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:35:49 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:35:49 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:35:49.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:35:50 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:35:50 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:35:50 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:35:50.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:35:51 np0005540826 nova_compute[229148]: 2025-12-01 10:35:51.123 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:35:51 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:35:51 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:35:51 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:35:51.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:35:51 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:35:52 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:35:52 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:35:52 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:35:52.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:35:52 np0005540826 nova_compute[229148]: 2025-12-01 10:35:52.614 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:35:53 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:35:53 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:35:53 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:35:53.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:35:54 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:35:54 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:35:54 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:35:54.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:35:55 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:35:55 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:35:55 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:35:55.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:35:56 np0005540826 nova_compute[229148]: 2025-12-01 10:35:56.124 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:35:56 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:35:56 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:35:56 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:35:56.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:35:56 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:35:57 np0005540826 nova_compute[229148]: 2025-12-01 10:35:57.618 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:35:57 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:35:57 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:35:57 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:35:57.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:35:58 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:35:58 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:35:58 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:35:58.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:35:59 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:35:59 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:35:59 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:35:59.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:36:00 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:36:00 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:36:00 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:36:00.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:36:01 np0005540826 nova_compute[229148]: 2025-12-01 10:36:01.126 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:36:01 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:36:01 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:36:01 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:36:01.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:36:01 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:36:02 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:36:02 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:36:02 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:36:02.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:36:02 np0005540826 nova_compute[229148]: 2025-12-01 10:36:02.620 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:36:03 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:36:03 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:36:03 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:36:03.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:36:04 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:36:04 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:36:04 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:36:04.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:36:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:36:04.569 141685 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:36:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:36:04.569 141685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:36:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:36:04.570 141685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:36:05 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:36:05 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:36:05 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:36:05.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:36:06 np0005540826 nova_compute[229148]: 2025-12-01 10:36:06.127 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:36:06 np0005540826 nova_compute[229148]: 2025-12-01 10:36:06.133 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:36:06 np0005540826 nova_compute[229148]: 2025-12-01 10:36:06.133 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  1 05:36:06 np0005540826 nova_compute[229148]: 2025-12-01 10:36:06.133 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  1 05:36:06 np0005540826 nova_compute[229148]: 2025-12-01 10:36:06.151 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  1 05:36:06 np0005540826 nova_compute[229148]: 2025-12-01 10:36:06.151 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:36:06 np0005540826 nova_compute[229148]: 2025-12-01 10:36:06.151 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:36:06 np0005540826 nova_compute[229148]: 2025-12-01 10:36:06.152 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  1 05:36:06 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:36:06 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:36:06 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:36:06.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:36:06 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:36:07 np0005540826 nova_compute[229148]: 2025-12-01 10:36:07.123 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:36:07 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec  1 05:36:07 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3551098333' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec  1 05:36:07 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec  1 05:36:07 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3551098333' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec  1 05:36:07 np0005540826 nova_compute[229148]: 2025-12-01 10:36:07.621 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:36:07 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:36:07 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:36:07 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:36:07.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:36:08 np0005540826 nova_compute[229148]: 2025-12-01 10:36:08.126 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:36:08 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:36:08 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:36:08 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:36:08.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:36:08 np0005540826 podman[254712]: 2025-12-01 10:36:08.985965024 +0000 UTC m=+0.054612113 container health_status 9ce59c2758d021c154e97f8a3178b73d8f43045c59b9ba6fd93369a760a49136 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  1 05:36:09 np0005540826 nova_compute[229148]: 2025-12-01 10:36:09.110 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:36:09 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:36:09 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:36:09 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:36:09.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:36:10 np0005540826 nova_compute[229148]: 2025-12-01 10:36:10.110 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:36:10 np0005540826 nova_compute[229148]: 2025-12-01 10:36:10.110 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:36:10 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:36:10 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:36:10 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:36:10.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:36:11 np0005540826 nova_compute[229148]: 2025-12-01 10:36:11.173 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:36:11 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:36:11 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:36:11 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:36:11.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:36:11 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:36:12 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:36:12 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:36:12 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:36:12.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:36:12 np0005540826 nova_compute[229148]: 2025-12-01 10:36:12.623 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:36:13 np0005540826 nova_compute[229148]: 2025-12-01 10:36:13.110 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:36:13 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:36:13 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:36:13 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:36:13.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:36:14 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:36:14 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:36:14 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:36:14.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:36:15 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:36:15 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:36:15 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:36:15.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:36:16 np0005540826 nova_compute[229148]: 2025-12-01 10:36:16.175 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:36:16 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:36:16 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:36:16 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:36:16.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:36:16 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:36:17 np0005540826 nova_compute[229148]: 2025-12-01 10:36:17.626 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:36:17 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:36:17 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:36:17 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:36:17.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:36:18 np0005540826 nova_compute[229148]: 2025-12-01 10:36:18.110 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:36:18 np0005540826 nova_compute[229148]: 2025-12-01 10:36:18.146 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:36:18 np0005540826 nova_compute[229148]: 2025-12-01 10:36:18.146 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:36:18 np0005540826 nova_compute[229148]: 2025-12-01 10:36:18.146 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:36:18 np0005540826 nova_compute[229148]: 2025-12-01 10:36:18.146 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  1 05:36:18 np0005540826 nova_compute[229148]: 2025-12-01 10:36:18.147 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:36:18 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:36:18 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:36:18 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:36:18.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:36:18 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:36:18 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3320979075' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:36:18 np0005540826 nova_compute[229148]: 2025-12-01 10:36:18.556 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.409s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:36:18 np0005540826 nova_compute[229148]: 2025-12-01 10:36:18.704 229152 WARNING nova.virt.libvirt.driver [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  1 05:36:18 np0005540826 nova_compute[229148]: 2025-12-01 10:36:18.705 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4853MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  1 05:36:18 np0005540826 nova_compute[229148]: 2025-12-01 10:36:18.705 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:36:18 np0005540826 nova_compute[229148]: 2025-12-01 10:36:18.705 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:36:18 np0005540826 nova_compute[229148]: 2025-12-01 10:36:18.793 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  1 05:36:18 np0005540826 nova_compute[229148]: 2025-12-01 10:36:18.793 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  1 05:36:18 np0005540826 nova_compute[229148]: 2025-12-01 10:36:18.810 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:36:19 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:36:19 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/309041401' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:36:19 np0005540826 nova_compute[229148]: 2025-12-01 10:36:19.231 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.421s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:36:19 np0005540826 nova_compute[229148]: 2025-12-01 10:36:19.236 229152 DEBUG nova.compute.provider_tree [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Inventory has not changed in ProviderTree for provider: 19014d04-db84-4f3d-831b-084720e9168c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  1 05:36:19 np0005540826 nova_compute[229148]: 2025-12-01 10:36:19.252 229152 DEBUG nova.scheduler.client.report [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Inventory has not changed for provider 19014d04-db84-4f3d-831b-084720e9168c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  1 05:36:19 np0005540826 nova_compute[229148]: 2025-12-01 10:36:19.253 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  1 05:36:19 np0005540826 nova_compute[229148]: 2025-12-01 10:36:19.253 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.548s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:36:19 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:36:19 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:36:19 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:36:19.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:36:19 np0005540826 podman[254807]: 2025-12-01 10:36:19.967838865 +0000 UTC m=+0.051082375 container health_status 2f6a34a2eb1139886d5df897b4264e2aa17883bd121a8757ac91426f6d44356f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Dec  1 05:36:19 np0005540826 podman[254808]: 2025-12-01 10:36:19.997905871 +0000 UTC m=+0.078371611 container health_status 6b06bbb7dde72e1297fe084ecc5611549b12415c8a2a7d771b9728c6924dbf55 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec  1 05:36:20 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:36:20 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:36:20 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:36:20.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:36:21 np0005540826 nova_compute[229148]: 2025-12-01 10:36:21.177 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:36:21 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:36:21 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:36:21 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:36:21.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:36:21 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:36:22 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:36:22 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:36:22 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:36:22.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:36:22 np0005540826 nova_compute[229148]: 2025-12-01 10:36:22.629 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:36:23 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:36:23 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:36:23 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:36:23.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:36:24 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:36:24 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:36:24 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:36:24.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:36:25 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:36:25 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:36:25 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:36:25.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:36:26 np0005540826 nova_compute[229148]: 2025-12-01 10:36:26.211 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:36:26 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:36:26 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:36:26 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:36:26.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:36:26 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:36:27 np0005540826 nova_compute[229148]: 2025-12-01 10:36:27.631 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:36:27 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:36:27 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:36:27 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:36:27.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:36:28 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:36:28 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:36:28 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:36:28.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:36:29 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:36:29 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:36:29 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:36:29.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:36:30 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:36:30 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:36:30 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:36:30.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:36:30 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:36:30 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:36:30 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 05:36:30 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:36:30 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:36:30 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 05:36:31 np0005540826 nova_compute[229148]: 2025-12-01 10:36:31.214 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:36:31 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:36:31 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:36:31 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:36:31.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:36:31 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:36:32 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:36:32 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:36:32 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:36:32.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:36:32 np0005540826 nova_compute[229148]: 2025-12-01 10:36:32.634 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:36:33 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:36:33 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:36:33 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:36:33.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:36:34 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:36:34 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:36:34 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:36:34.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:36:35 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:36:35 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:36:35 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:36:35.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:36:36 np0005540826 nova_compute[229148]: 2025-12-01 10:36:36.218 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:36:36 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:36:36 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:36:36 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:36:36.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:36:36 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:36:37 np0005540826 nova_compute[229148]: 2025-12-01 10:36:37.636 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:36:37 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:36:37 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:36:37 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:36:37.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:36:38 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:36:38 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:36:38 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:36:38 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:36:38 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:36:38.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:36:39 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:36:39 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:36:39 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:36:39.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:36:39 np0005540826 podman[254996]: 2025-12-01 10:36:39.995839631 +0000 UTC m=+0.073175590 container health_status 9ce59c2758d021c154e97f8a3178b73d8f43045c59b9ba6fd93369a760a49136 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=multipathd, container_name=multipathd)
Dec  1 05:36:40 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:36:40 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:36:40 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:36:40.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:36:41 np0005540826 nova_compute[229148]: 2025-12-01 10:36:41.218 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:36:41 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:36:41 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:36:41 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:36:41.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:36:41 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:36:42 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:36:42 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:36:42 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:36:42.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:36:42 np0005540826 nova_compute[229148]: 2025-12-01 10:36:42.638 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:36:43 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:36:43 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:36:43 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:36:43.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:36:44 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:36:44 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:36:44 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:36:44.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:36:45 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:36:45 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:36:45 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:36:45.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:36:46 np0005540826 nova_compute[229148]: 2025-12-01 10:36:46.262 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:36:46 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:36:46 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:36:46 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:36:46.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:36:46 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:36:47 np0005540826 nova_compute[229148]: 2025-12-01 10:36:47.641 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:36:47 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:36:47 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:36:47 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:36:47.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:36:48 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:36:48 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:36:48 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:36:48.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:36:49 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:36:49 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:36:49 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:36:49.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:36:50 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:36:50 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:36:50 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:36:50.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:36:50 np0005540826 podman[255048]: 2025-12-01 10:36:50.925549092 +0000 UTC m=+0.061581389 container health_status 2f6a34a2eb1139886d5df897b4264e2aa17883bd121a8757ac91426f6d44356f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Dec  1 05:36:50 np0005540826 podman[255049]: 2025-12-01 10:36:50.955051913 +0000 UTC m=+0.095671575 container health_status 6b06bbb7dde72e1297fe084ecc5611549b12415c8a2a7d771b9728c6924dbf55 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec  1 05:36:51 np0005540826 nova_compute[229148]: 2025-12-01 10:36:51.305 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:36:51 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:36:51 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:36:51 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:36:51 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:36:51.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:36:52 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:36:52 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:36:52 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:36:52.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:36:52 np0005540826 nova_compute[229148]: 2025-12-01 10:36:52.643 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:36:53 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:36:53 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:36:53 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:36:53.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:36:54 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:36:54 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:36:54 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:36:54.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:36:55 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:36:55 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:36:55 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:36:55.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:36:56 np0005540826 nova_compute[229148]: 2025-12-01 10:36:56.308 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:36:56 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:36:56 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:36:56 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:36:56.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:36:56 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:36:57 np0005540826 nova_compute[229148]: 2025-12-01 10:36:57.645 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:36:57 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:36:57 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:36:57 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:36:57.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:36:58 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:36:58 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:36:58 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:36:58.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:36:59 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:36:59 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:36:59 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:36:59.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:37:00 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:37:00 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:37:00 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:37:00.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:37:01 np0005540826 nova_compute[229148]: 2025-12-01 10:37:01.311 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:37:01 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:37:01 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:37:01 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:37:01 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:37:01.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:37:02 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:37:02 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:37:02 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:37:02.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:37:02 np0005540826 nova_compute[229148]: 2025-12-01 10:37:02.648 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:37:03 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:37:03 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:37:03 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:37:03.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:37:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:37:04.571 141685 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:37:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:37:04.572 141685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:37:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:37:04.572 141685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:37:04 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:37:04 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:37:04 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:37:04.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:37:05 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:37:05 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:37:05 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:37:05.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:37:06 np0005540826 nova_compute[229148]: 2025-12-01 10:37:06.332 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:37:06 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:37:06 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:37:06 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:37:06.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:37:06 np0005540826 ceph-mon[80026]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #79. Immutable memtables: 0.
Dec  1 05:37:06 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:37:06.777264) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  1 05:37:06 np0005540826 ceph-mon[80026]: rocksdb: [db/flush_job.cc:856] [default] [JOB 47] Flushing memtable with next log file: 79
Dec  1 05:37:06 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764585426777304, "job": 47, "event": "flush_started", "num_memtables": 1, "num_entries": 1504, "num_deletes": 251, "total_data_size": 3928196, "memory_usage": 3990016, "flush_reason": "Manual Compaction"}
Dec  1 05:37:06 np0005540826 ceph-mon[80026]: rocksdb: [db/flush_job.cc:885] [default] [JOB 47] Level-0 flush table #80: started
Dec  1 05:37:06 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764585426795011, "cf_name": "default", "job": 47, "event": "table_file_creation", "file_number": 80, "file_size": 2533730, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 40423, "largest_seqno": 41922, "table_properties": {"data_size": 2527310, "index_size": 3619, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1733, "raw_key_size": 13565, "raw_average_key_size": 20, "raw_value_size": 2514458, "raw_average_value_size": 3714, "num_data_blocks": 158, "num_entries": 677, "num_filter_entries": 677, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764585297, "oldest_key_time": 1764585297, "file_creation_time": 1764585426, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d6e1f97e-eb58-41c1-b758-cb672eabd75e", "db_session_id": "KSHNHU57VR1LHZ6EZUM4", "orig_file_number": 80, "seqno_to_time_mapping": "N/A"}}
Dec  1 05:37:06 np0005540826 ceph-mon[80026]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 47] Flush lasted 17903 microseconds, and 7516 cpu microseconds.
Dec  1 05:37:06 np0005540826 ceph-mon[80026]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 05:37:06 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:37:06.795155) [db/flush_job.cc:967] [default] [JOB 47] Level-0 flush table #80: 2533730 bytes OK
Dec  1 05:37:06 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:37:06.795188) [db/memtable_list.cc:519] [default] Level-0 commit table #80 started
Dec  1 05:37:06 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:37:06.796968) [db/memtable_list.cc:722] [default] Level-0 commit table #80: memtable #1 done
Dec  1 05:37:06 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:37:06.796989) EVENT_LOG_v1 {"time_micros": 1764585426796982, "job": 47, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  1 05:37:06 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:37:06.797011) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  1 05:37:06 np0005540826 ceph-mon[80026]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 47] Try to delete WAL files size 3921173, prev total WAL file size 3921173, number of live WAL files 2.
Dec  1 05:37:06 np0005540826 ceph-mon[80026]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000076.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:37:06 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:37:06.798947) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730033323633' seq:72057594037927935, type:22 .. '7061786F730033353135' seq:0, type:0; will stop at (end)
Dec  1 05:37:06 np0005540826 ceph-mon[80026]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 48] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  1 05:37:06 np0005540826 ceph-mon[80026]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 47 Base level 0, inputs: [80(2474KB)], [78(14MB)]
Dec  1 05:37:06 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764585426799008, "job": 48, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [80], "files_L6": [78], "score": -1, "input_data_size": 17434150, "oldest_snapshot_seqno": -1}
Dec  1 05:37:06 np0005540826 ceph-mon[80026]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 48] Generated table #81: 6940 keys, 15161307 bytes, temperature: kUnknown
Dec  1 05:37:06 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764585426893599, "cf_name": "default", "job": 48, "event": "table_file_creation", "file_number": 81, "file_size": 15161307, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 15115640, "index_size": 27187, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 17413, "raw_key_size": 182857, "raw_average_key_size": 26, "raw_value_size": 14991316, "raw_average_value_size": 2160, "num_data_blocks": 1068, "num_entries": 6940, "num_filter_entries": 6940, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582559, "oldest_key_time": 0, "file_creation_time": 1764585426, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d6e1f97e-eb58-41c1-b758-cb672eabd75e", "db_session_id": "KSHNHU57VR1LHZ6EZUM4", "orig_file_number": 81, "seqno_to_time_mapping": "N/A"}}
Dec  1 05:37:06 np0005540826 ceph-mon[80026]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 05:37:06 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:37:06.893990) [db/compaction/compaction_job.cc:1663] [default] [JOB 48] Compacted 1@0 + 1@6 files to L6 => 15161307 bytes
Dec  1 05:37:06 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:37:06.895389) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 184.1 rd, 160.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.4, 14.2 +0.0 blob) out(14.5 +0.0 blob), read-write-amplify(12.9) write-amplify(6.0) OK, records in: 7456, records dropped: 516 output_compression: NoCompression
Dec  1 05:37:06 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:37:06.895420) EVENT_LOG_v1 {"time_micros": 1764585426895406, "job": 48, "event": "compaction_finished", "compaction_time_micros": 94725, "compaction_time_cpu_micros": 55942, "output_level": 6, "num_output_files": 1, "total_output_size": 15161307, "num_input_records": 7456, "num_output_records": 6940, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  1 05:37:06 np0005540826 ceph-mon[80026]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000080.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:37:06 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764585426896651, "job": 48, "event": "table_file_deletion", "file_number": 80}
Dec  1 05:37:06 np0005540826 ceph-mon[80026]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000078.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:37:06 np0005540826 ceph-mon[80026]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764585426902100, "job": 48, "event": "table_file_deletion", "file_number": 78}
Dec  1 05:37:06 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:37:06.798829) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:37:06 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:37:06.902306) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:37:06 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:37:06.902317) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:37:06 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:37:06.902322) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:37:06 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:37:06.902328) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:37:06 np0005540826 ceph-mon[80026]: rocksdb: (Original Log Time 2025/12/01-10:37:06.902333) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:37:06 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:37:07 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec  1 05:37:07 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4194406903' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec  1 05:37:07 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec  1 05:37:07 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4194406903' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec  1 05:37:07 np0005540826 nova_compute[229148]: 2025-12-01 10:37:07.253 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:37:07 np0005540826 nova_compute[229148]: 2025-12-01 10:37:07.651 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:37:07 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:37:07 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:37:07 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:37:07.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:37:08 np0005540826 nova_compute[229148]: 2025-12-01 10:37:08.110 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:37:08 np0005540826 nova_compute[229148]: 2025-12-01 10:37:08.111 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  1 05:37:08 np0005540826 nova_compute[229148]: 2025-12-01 10:37:08.111 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  1 05:37:08 np0005540826 nova_compute[229148]: 2025-12-01 10:37:08.145 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  1 05:37:08 np0005540826 nova_compute[229148]: 2025-12-01 10:37:08.145 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:37:08 np0005540826 nova_compute[229148]: 2025-12-01 10:37:08.145 229152 DEBUG nova.compute.manager [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  1 05:37:08 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:37:08 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:37:08 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:37:08.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:37:09 np0005540826 nova_compute[229148]: 2025-12-01 10:37:09.110 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:37:09 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:37:09 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:37:09 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:37:09.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:37:10 np0005540826 nova_compute[229148]: 2025-12-01 10:37:10.106 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:37:10 np0005540826 nova_compute[229148]: 2025-12-01 10:37:10.109 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:37:10 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:37:10 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:37:10 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:37:10.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:37:10 np0005540826 podman[255119]: 2025-12-01 10:37:10.968096342 +0000 UTC m=+0.050171662 container health_status 9ce59c2758d021c154e97f8a3178b73d8f43045c59b9ba6fd93369a760a49136 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec  1 05:37:11 np0005540826 nova_compute[229148]: 2025-12-01 10:37:11.110 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:37:11 np0005540826 nova_compute[229148]: 2025-12-01 10:37:11.383 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:37:11 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:37:11 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:37:11 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:37:11 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:37:11.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:37:12 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:37:12 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:37:12 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:37:12.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:37:12 np0005540826 nova_compute[229148]: 2025-12-01 10:37:12.654 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:37:13 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:37:13 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:37:13 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:37:13.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:37:14 np0005540826 nova_compute[229148]: 2025-12-01 10:37:14.110 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:37:14 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:37:14 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:37:14 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:37:14.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:37:15 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:37:15 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:37:15 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:37:15.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:37:16 np0005540826 nova_compute[229148]: 2025-12-01 10:37:16.385 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:37:16 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:37:16 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:37:16 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:37:16.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:37:16 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:37:17 np0005540826 nova_compute[229148]: 2025-12-01 10:37:17.683 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:37:17 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:37:17 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:37:17 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:37:17.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:37:18 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:37:18 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:37:18 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:37:18.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:37:19 np0005540826 nova_compute[229148]: 2025-12-01 10:37:19.110 229152 DEBUG oslo_service.periodic_task [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:37:19 np0005540826 nova_compute[229148]: 2025-12-01 10:37:19.136 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:37:19 np0005540826 nova_compute[229148]: 2025-12-01 10:37:19.137 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:37:19 np0005540826 nova_compute[229148]: 2025-12-01 10:37:19.137 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:37:19 np0005540826 nova_compute[229148]: 2025-12-01 10:37:19.137 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  1 05:37:19 np0005540826 nova_compute[229148]: 2025-12-01 10:37:19.137 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:37:19 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:37:19 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2572138056' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:37:19 np0005540826 nova_compute[229148]: 2025-12-01 10:37:19.669 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.532s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:37:19 np0005540826 nova_compute[229148]: 2025-12-01 10:37:19.880 229152 WARNING nova.virt.libvirt.driver [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  1 05:37:19 np0005540826 nova_compute[229148]: 2025-12-01 10:37:19.882 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4881MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  1 05:37:19 np0005540826 nova_compute[229148]: 2025-12-01 10:37:19.883 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:37:19 np0005540826 nova_compute[229148]: 2025-12-01 10:37:19.884 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:37:19 np0005540826 nova_compute[229148]: 2025-12-01 10:37:19.949 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  1 05:37:19 np0005540826 nova_compute[229148]: 2025-12-01 10:37:19.949 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  1 05:37:19 np0005540826 nova_compute[229148]: 2025-12-01 10:37:19.968 229152 DEBUG nova.scheduler.client.report [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Refreshing inventories for resource provider 19014d04-db84-4f3d-831b-084720e9168c _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Dec  1 05:37:19 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:37:19 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:37:19 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:37:19.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:37:19 np0005540826 nova_compute[229148]: 2025-12-01 10:37:19.998 229152 DEBUG nova.scheduler.client.report [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Updating ProviderTree inventory for provider 19014d04-db84-4f3d-831b-084720e9168c from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Dec  1 05:37:19 np0005540826 nova_compute[229148]: 2025-12-01 10:37:19.998 229152 DEBUG nova.compute.provider_tree [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Updating inventory in ProviderTree for provider 19014d04-db84-4f3d-831b-084720e9168c with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Dec  1 05:37:20 np0005540826 nova_compute[229148]: 2025-12-01 10:37:20.032 229152 DEBUG nova.scheduler.client.report [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Refreshing aggregate associations for resource provider 19014d04-db84-4f3d-831b-084720e9168c, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Dec  1 05:37:20 np0005540826 nova_compute[229148]: 2025-12-01 10:37:20.069 229152 DEBUG nova.scheduler.client.report [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Refreshing trait associations for resource provider 19014d04-db84-4f3d-831b-084720e9168c, traits: COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SHA,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE2,HW_CPU_X86_ABM,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_FDC,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE4A,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE,HW_CPU_X86_AMD_SVM,HW_CPU_X86_MMX,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_AESNI,HW_CPU_X86_AVX2,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSE42,COMPUTE_RESCUE_BFV,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_AVX,HW_CPU_X86_F16C,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_CLMUL,HW_CPU_X86_BMI,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_BMI2,HW_CPU_X86_FMA3,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SVM,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SOCKET_PCI_NUMA_AFFINITY _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Dec  1 05:37:20 np0005540826 nova_compute[229148]: 2025-12-01 10:37:20.087 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:37:20 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:37:20 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3619192831' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:37:20 np0005540826 nova_compute[229148]: 2025-12-01 10:37:20.582 229152 DEBUG oslo_concurrency.processutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.495s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:37:20 np0005540826 nova_compute[229148]: 2025-12-01 10:37:20.588 229152 DEBUG nova.compute.provider_tree [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Inventory has not changed in ProviderTree for provider: 19014d04-db84-4f3d-831b-084720e9168c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  1 05:37:20 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:37:20 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.002000050s ======
Dec  1 05:37:20 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:37:20.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000050s
Dec  1 05:37:20 np0005540826 nova_compute[229148]: 2025-12-01 10:37:20.612 229152 DEBUG nova.scheduler.client.report [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Inventory has not changed for provider 19014d04-db84-4f3d-831b-084720e9168c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  1 05:37:20 np0005540826 nova_compute[229148]: 2025-12-01 10:37:20.613 229152 DEBUG nova.compute.resource_tracker [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  1 05:37:20 np0005540826 nova_compute[229148]: 2025-12-01 10:37:20.613 229152 DEBUG oslo_concurrency.lockutils [None req-2aeceae9-3ee6-40eb-8a6f-2696d09d54a8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.730s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:37:21 np0005540826 nova_compute[229148]: 2025-12-01 10:37:21.429 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:37:21 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:37:21 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:37:21 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:37:21 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:37:21.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:37:22 np0005540826 podman[255197]: 2025-12-01 10:37:22.024431475 +0000 UTC m=+0.098984588 container health_status 2f6a34a2eb1139886d5df897b4264e2aa17883bd121a8757ac91426f6d44356f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_id=ovn_metadata_agent)
Dec  1 05:37:22 np0005540826 podman[255198]: 2025-12-01 10:37:22.07593394 +0000 UTC m=+0.147085998 container health_status 6b06bbb7dde72e1297fe084ecc5611549b12415c8a2a7d771b9728c6924dbf55 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec  1 05:37:22 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:37:22 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:37:22 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:37:22.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:37:22 np0005540826 nova_compute[229148]: 2025-12-01 10:37:22.684 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:37:23 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:37:23 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:37:23 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:37:23.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:37:24 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:37:24 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:37:24 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:37:24.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:37:26 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:37:26 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:37:26 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:37:26.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:37:26 np0005540826 nova_compute[229148]: 2025-12-01 10:37:26.431 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:37:26 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:37:26 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:37:26 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:37:26.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:37:26 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:37:27 np0005540826 nova_compute[229148]: 2025-12-01 10:37:27.687 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:37:28 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:37:28 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:37:28 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:37:28.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:37:28 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:37:28 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:37:28 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:37:28.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:37:30 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:37:30 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:37:30 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:37:30.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:37:30 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:37:30 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:37:30 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:37:30.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:37:31 np0005540826 nova_compute[229148]: 2025-12-01 10:37:31.438 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:37:31 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:37:32 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:37:32 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:37:32 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:37:32.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:37:32 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:37:32 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:37:32 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:37:32.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:37:32 np0005540826 nova_compute[229148]: 2025-12-01 10:37:32.720 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:37:34 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:37:34 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:37:34 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:37:34.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:37:34 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:37:34 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:37:34 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:37:34.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:37:36 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:37:36 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:37:36 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:37:36.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:37:36 np0005540826 nova_compute[229148]: 2025-12-01 10:37:36.440 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:37:36 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:37:36 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:37:36 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:37:36.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:37:36 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:37:37 np0005540826 nova_compute[229148]: 2025-12-01 10:37:37.755 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:37:38 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:37:38 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:37:38 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:37:38.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:37:38 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:37:38 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:37:38 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:37:38.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:37:38 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 05:37:38 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:37:38 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:37:38 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 05:37:40 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:37:40 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:37:40 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:37:40.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:37:40 np0005540826 systemd-logind[787]: New session 58 of user zuul.
Dec  1 05:37:40 np0005540826 systemd[1]: Started Session 58 of User zuul.
Dec  1 05:37:40 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:37:40 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:37:40 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:37:40.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:37:41 np0005540826 podman[255397]: 2025-12-01 10:37:41.430758828 +0000 UTC m=+0.076086463 container health_status 9ce59c2758d021c154e97f8a3178b73d8f43045c59b9ba6fd93369a760a49136 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd)
Dec  1 05:37:41 np0005540826 nova_compute[229148]: 2025-12-01 10:37:41.485 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:37:41 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:37:42 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:37:42 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:37:42 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:37:42.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:37:42 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:37:42 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:37:42 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:37:42.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:37:42 np0005540826 nova_compute[229148]: 2025-12-01 10:37:42.758 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:37:43 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "status"} v 0)
Dec  1 05:37:43 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1668722920' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Dec  1 05:37:44 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:37:44 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:37:44 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:37:44.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:37:44 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:37:44 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:37:44 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:37:44.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:37:45 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:37:45 np0005540826 ceph-mon[80026]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:37:46 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:37:46 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:37:46 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:37:46.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:37:46 np0005540826 nova_compute[229148]: 2025-12-01 10:37:46.486 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:37:46 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:37:46 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:37:46 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:37:46.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:37:46 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:37:47 np0005540826 ovs-vsctl[255739]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Dec  1 05:37:47 np0005540826 nova_compute[229148]: 2025-12-01 10:37:47.760 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:37:48 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:37:48 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:37:48 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:37:48.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:37:48 np0005540826 virtqemud[228647]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Dec  1 05:37:48 np0005540826 virtqemud[228647]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Dec  1 05:37:48 np0005540826 virtqemud[228647]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Dec  1 05:37:48 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:37:48 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:37:48 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:37:48.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:37:48 np0005540826 ceph-mds[84503]: mds.cephfs.compute-1.ijlzoi asok_command: cache status {prefix=cache status} (starting...)
Dec  1 05:37:48 np0005540826 ceph-mds[84503]: mds.cephfs.compute-1.ijlzoi Can't run that command on an inactive MDS!
Dec  1 05:37:49 np0005540826 ceph-mds[84503]: mds.cephfs.compute-1.ijlzoi asok_command: client ls {prefix=client ls} (starting...)
Dec  1 05:37:49 np0005540826 ceph-mds[84503]: mds.cephfs.compute-1.ijlzoi Can't run that command on an inactive MDS!
Dec  1 05:37:49 np0005540826 lvm[256063]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec  1 05:37:49 np0005540826 lvm[256063]: VG ceph_vg0 finished
Dec  1 05:37:49 np0005540826 ceph-mds[84503]: mds.cephfs.compute-1.ijlzoi asok_command: damage ls {prefix=damage ls} (starting...)
Dec  1 05:37:49 np0005540826 ceph-mds[84503]: mds.cephfs.compute-1.ijlzoi Can't run that command on an inactive MDS!
Dec  1 05:37:49 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "report"} v 0)
Dec  1 05:37:49 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1216379499' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Dec  1 05:37:49 np0005540826 ceph-mds[84503]: mds.cephfs.compute-1.ijlzoi asok_command: dump loads {prefix=dump loads} (starting...)
Dec  1 05:37:49 np0005540826 ceph-mds[84503]: mds.cephfs.compute-1.ijlzoi Can't run that command on an inactive MDS!
Dec  1 05:37:49 np0005540826 ceph-mds[84503]: mds.cephfs.compute-1.ijlzoi asok_command: dump tree {prefix=dump tree,root=/} (starting...)
Dec  1 05:37:49 np0005540826 ceph-mds[84503]: mds.cephfs.compute-1.ijlzoi Can't run that command on an inactive MDS!
Dec  1 05:37:50 np0005540826 ceph-mds[84503]: mds.cephfs.compute-1.ijlzoi asok_command: dump_blocked_ops {prefix=dump_blocked_ops} (starting...)
Dec  1 05:37:50 np0005540826 ceph-mds[84503]: mds.cephfs.compute-1.ijlzoi Can't run that command on an inactive MDS!
Dec  1 05:37:50 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:37:50 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:37:50 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:37:50.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:37:50 np0005540826 ceph-mds[84503]: mds.cephfs.compute-1.ijlzoi asok_command: dump_historic_ops {prefix=dump_historic_ops} (starting...)
Dec  1 05:37:50 np0005540826 ceph-mds[84503]: mds.cephfs.compute-1.ijlzoi Can't run that command on an inactive MDS!
Dec  1 05:37:50 np0005540826 ceph-mds[84503]: mds.cephfs.compute-1.ijlzoi asok_command: dump_historic_ops_by_duration {prefix=dump_historic_ops_by_duration} (starting...)
Dec  1 05:37:50 np0005540826 ceph-mds[84503]: mds.cephfs.compute-1.ijlzoi Can't run that command on an inactive MDS!
Dec  1 05:37:50 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec  1 05:37:50 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1911794276' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec  1 05:37:50 np0005540826 ceph-mds[84503]: mds.cephfs.compute-1.ijlzoi asok_command: dump_ops_in_flight {prefix=dump_ops_in_flight} (starting...)
Dec  1 05:37:50 np0005540826 ceph-mds[84503]: mds.cephfs.compute-1.ijlzoi Can't run that command on an inactive MDS!
Dec  1 05:37:50 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:37:50 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:37:50 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:37:50.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:37:50 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config log"} v 0)
Dec  1 05:37:50 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3545264155' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Dec  1 05:37:50 np0005540826 ceph-mds[84503]: mds.cephfs.compute-1.ijlzoi asok_command: get subtrees {prefix=get subtrees} (starting...)
Dec  1 05:37:50 np0005540826 ceph-mds[84503]: mds.cephfs.compute-1.ijlzoi Can't run that command on an inactive MDS!
Dec  1 05:37:50 np0005540826 ceph-mds[84503]: mds.cephfs.compute-1.ijlzoi asok_command: ops {prefix=ops} (starting...)
Dec  1 05:37:50 np0005540826 ceph-mds[84503]: mds.cephfs.compute-1.ijlzoi Can't run that command on an inactive MDS!
Dec  1 05:37:51 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm"} v 0)
Dec  1 05:37:51 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2024571894' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Dec  1 05:37:51 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config-key dump"} v 0)
Dec  1 05:37:51 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2815357942' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Dec  1 05:37:51 np0005540826 nova_compute[229148]: 2025-12-01 10:37:51.531 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:37:51 np0005540826 ceph-mds[84503]: mds.cephfs.compute-1.ijlzoi asok_command: session ls {prefix=session ls} (starting...)
Dec  1 05:37:51 np0005540826 ceph-mds[84503]: mds.cephfs.compute-1.ijlzoi Can't run that command on an inactive MDS!
Dec  1 05:37:51 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0)
Dec  1 05:37:51 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2909162309' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Dec  1 05:37:51 np0005540826 ceph-mds[84503]: mds.cephfs.compute-1.ijlzoi asok_command: status {prefix=status} (starting...)
Dec  1 05:37:51 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:37:52 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:37:52 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:37:52 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:37:52.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:37:52 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Dec  1 05:37:52 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/708766779' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Dec  1 05:37:52 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "features"} v 0)
Dec  1 05:37:52 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/369458245' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Dec  1 05:37:52 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Dec  1 05:37:52 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1353961900' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Dec  1 05:37:52 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:37:52 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:37:52 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:37:52.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:37:52 np0005540826 nova_compute[229148]: 2025-12-01 10:37:52.761 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:37:52 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail"} v 0)
Dec  1 05:37:52 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3812692844' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Dec  1 05:37:52 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0)
Dec  1 05:37:52 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3369796777' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Dec  1 05:37:52 np0005540826 podman[256627]: 2025-12-01 10:37:52.973830784 +0000 UTC m=+0.057149168 container health_status 2f6a34a2eb1139886d5df897b4264e2aa17883bd121a8757ac91426f6d44356f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec  1 05:37:53 np0005540826 podman[256630]: 2025-12-01 10:37:53.011460429 +0000 UTC m=+0.091002278 container health_status 6b06bbb7dde72e1297fe084ecc5611549b12415c8a2a7d771b9728c6924dbf55 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, io.buildah.version=1.41.3)
Dec  1 05:37:53 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr stat"} v 0)
Dec  1 05:37:53 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1197698938' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Dec  1 05:37:53 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} v 0)
Dec  1 05:37:53 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1671515067' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Dec  1 05:37:53 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0)
Dec  1 05:37:53 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2033843320' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Dec  1 05:37:54 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:37:54 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:37:54 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:37:54.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:37:54 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} v 0)
Dec  1 05:37:54 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1949553522' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Dec  1 05:37:54 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:37:54 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:37:54 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:37:54.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:37:54 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0)
Dec  1 05:37:54 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3150585897' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Dec  1 05:37:55 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Dec  1 05:37:55 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4131228724' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81862656 unmapped: 933888 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81870848 unmapped: 925696 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932760 data_alloc: 218103808 data_used: 155648
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81870848 unmapped: 925696 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81870848 unmapped: 925696 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81870848 unmapped: 925696 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81870848 unmapped: 925696 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 ms_handle_reset con 0x55e0b6a18c00 session 0x55e0b7d352c0
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 ms_handle_reset con 0x55e0b711d000 session 0x55e0b7bfe1e0
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81870848 unmapped: 925696 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932760 data_alloc: 218103808 data_used: 155648
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81870848 unmapped: 925696 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81870848 unmapped: 925696 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81870848 unmapped: 925696 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81870848 unmapped: 925696 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81870848 unmapped: 925696 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932760 data_alloc: 218103808 data_used: 155648
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81887232 unmapped: 909312 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81887232 unmapped: 909312 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81887232 unmapped: 909312 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 ms_handle_reset con 0x55e0b5290800 session 0x55e0b524e5a0
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81887232 unmapped: 909312 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 21.795988083s of 21.811857224s, submitted: 4
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81887232 unmapped: 909312 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 933024 data_alloc: 218103808 data_used: 155648
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81887232 unmapped: 909312 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81887232 unmapped: 909312 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81895424 unmapped: 901120 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81895424 unmapped: 901120 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81895424 unmapped: 901120 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 934536 data_alloc: 218103808 data_used: 155648
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81895424 unmapped: 901120 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81903616 unmapped: 892928 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81903616 unmapped: 892928 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81903616 unmapped: 892928 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81920000 unmapped: 876544 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935305 data_alloc: 218103808 data_used: 151552
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81920000 unmapped: 876544 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81928192 unmapped: 868352 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81928192 unmapped: 868352 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 13.926730156s of 14.144864082s, submitted: 15
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81960960 unmapped: 835584 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 ms_handle_reset con 0x55e0b47d0c00 session 0x55e0b7d2e000
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81977344 unmapped: 819200 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935325 data_alloc: 218103808 data_used: 155648
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81977344 unmapped: 819200 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81977344 unmapped: 819200 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81977344 unmapped: 819200 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81977344 unmapped: 819200 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81977344 unmapped: 819200 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935193 data_alloc: 218103808 data_used: 155648
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81977344 unmapped: 819200 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81977344 unmapped: 819200 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81977344 unmapped: 819200 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 81977344 unmapped: 819200 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.862897873s of 11.870977402s, submitted: 2
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82001920 unmapped: 794624 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935325 data_alloc: 218103808 data_used: 155648
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82001920 unmapped: 794624 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82001920 unmapped: 794624 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82001920 unmapped: 794624 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82010112 unmapped: 786432 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82010112 unmapped: 786432 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935341 data_alloc: 218103808 data_used: 151552
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82010112 unmapped: 786432 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82026496 unmapped: 770048 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82026496 unmapped: 770048 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82026496 unmapped: 770048 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82026496 unmapped: 770048 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 934582 data_alloc: 218103808 data_used: 151552
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82026496 unmapped: 770048 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.008006096s of 12.044129372s, submitted: 11
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82026496 unmapped: 770048 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82026496 unmapped: 770048 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82026496 unmapped: 770048 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 934011 data_alloc: 218103808 data_used: 155648
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 934011 data_alloc: 218103808 data_used: 155648
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 934011 data_alloc: 218103808 data_used: 155648
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 934011 data_alloc: 218103808 data_used: 155648
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 934011 data_alloc: 218103808 data_used: 155648
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 934011 data_alloc: 218103808 data_used: 155648
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 934011 data_alloc: 218103808 data_used: 155648
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 934011 data_alloc: 218103808 data_used: 155648
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 934011 data_alloc: 218103808 data_used: 155648
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 934011 data_alloc: 218103808 data_used: 155648
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82083840 unmapped: 712704 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82083840 unmapped: 712704 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 ms_handle_reset con 0x55e0b6a18c00 session 0x55e0b53a45a0
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82083840 unmapped: 712704 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82083840 unmapped: 712704 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 934011 data_alloc: 218103808 data_used: 155648
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82083840 unmapped: 712704 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82083840 unmapped: 712704 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82083840 unmapped: 712704 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82083840 unmapped: 712704 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82083840 unmapped: 712704 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 934011 data_alloc: 218103808 data_used: 155648
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82083840 unmapped: 712704 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82083840 unmapped: 712704 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82083840 unmapped: 712704 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 62.043544769s of 62.273132324s, submitted: 2
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82083840 unmapped: 712704 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 ms_handle_reset con 0x55e0b78b6400 session 0x55e0b7f1af00
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82083840 unmapped: 712704 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 934143 data_alloc: 218103808 data_used: 155648
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82083840 unmapped: 712704 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82083840 unmapped: 712704 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82083840 unmapped: 712704 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82083840 unmapped: 712704 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82083840 unmapped: 712704 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 934159 data_alloc: 218103808 data_used: 151552
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82083840 unmapped: 712704 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82083840 unmapped: 712704 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82083840 unmapped: 712704 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82083840 unmapped: 712704 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.962292671s of 10.977412224s, submitted: 4
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82083840 unmapped: 712704 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 934291 data_alloc: 218103808 data_used: 151552
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82083840 unmapped: 712704 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82092032 unmapped: 704512 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82092032 unmapped: 704512 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 83156992 unmapped: 688128 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 83156992 unmapped: 688128 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 934159 data_alloc: 218103808 data_used: 151552
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82132992 unmapped: 1712128 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82132992 unmapped: 1712128 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82132992 unmapped: 1712128 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82132992 unmapped: 1712128 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82132992 unmapped: 1712128 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 933991 data_alloc: 218103808 data_used: 151552
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82132992 unmapped: 1712128 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82157568 unmapped: 1687552 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82157568 unmapped: 1687552 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82157568 unmapped: 1687552 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 14.645601273s of 14.703873634s, submitted: 13
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82182144 unmapped: 1662976 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 934011 data_alloc: 218103808 data_used: 155648
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82182144 unmapped: 1662976 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82182144 unmapped: 1662976 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82182144 unmapped: 1662976 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82182144 unmapped: 1662976 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82182144 unmapped: 1662976 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 934011 data_alloc: 218103808 data_used: 155648
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82182144 unmapped: 1662976 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82182144 unmapped: 1662976 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 ms_handle_reset con 0x55e0b78b6000 session 0x55e0b5e52b40
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82182144 unmapped: 1662976 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82182144 unmapped: 1662976 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82182144 unmapped: 1662976 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 934011 data_alloc: 218103808 data_used: 155648
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82182144 unmapped: 1662976 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82182144 unmapped: 1662976 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82182144 unmapped: 1662976 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82182144 unmapped: 1662976 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 ms_handle_reset con 0x55e0b78b7800 session 0x55e0b524e5a0
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82182144 unmapped: 1662976 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 934011 data_alloc: 218103808 data_used: 155648
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82182144 unmapped: 1662976 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82182144 unmapped: 1662976 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 18.513740540s of 18.526220322s, submitted: 1
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82182144 unmapped: 1662976 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82182144 unmapped: 1662976 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82182144 unmapped: 1662976 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 934143 data_alloc: 218103808 data_used: 155648
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82182144 unmapped: 1662976 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82182144 unmapped: 1662976 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82182144 unmapped: 1662976 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82182144 unmapped: 1662976 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82182144 unmapped: 1662976 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935803 data_alloc: 218103808 data_used: 151552
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82182144 unmapped: 1662976 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82182144 unmapped: 1662976 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82182144 unmapped: 1662976 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82198528 unmapped: 1646592 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1200.0 total, 600.0 interval#012Cumulative writes: 7887 writes, 31K keys, 7887 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.02 MB/s#012Cumulative WAL: 7887 writes, 1549 syncs, 5.09 writes per sync, written: 0.02 GB, 0.02 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 761 writes, 1336 keys, 761 commit groups, 1.0 writes per commit group, ingest: 0.56 MB, 0.00 MB/s#012Interval WAL: 761 writes, 374 syncs, 2.03 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0#012 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.0 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55e0b36fd350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.5e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-0] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.0 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55e0b36fd350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.5e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-1] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.0 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82231296 unmapped: 1613824 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935803 data_alloc: 218103808 data_used: 151552
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.910443306s of 12.943789482s, submitted: 11
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82255872 unmapped: 1589248 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82255872 unmapped: 1589248 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82255872 unmapped: 1589248 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82280448 unmapped: 1564672 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82280448 unmapped: 1564672 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 937167 data_alloc: 218103808 data_used: 155648
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82280448 unmapped: 1564672 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82280448 unmapped: 1564672 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82280448 unmapped: 1564672 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82280448 unmapped: 1564672 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82280448 unmapped: 1564672 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 936444 data_alloc: 218103808 data_used: 155648
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82280448 unmapped: 1564672 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82280448 unmapped: 1564672 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82280448 unmapped: 1564672 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82280448 unmapped: 1564672 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82280448 unmapped: 1564672 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 936444 data_alloc: 218103808 data_used: 155648
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82280448 unmapped: 1564672 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82288640 unmapped: 1556480 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82288640 unmapped: 1556480 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82305024 unmapped: 1540096 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 ms_handle_reset con 0x55e0b6eee400 session 0x55e0b81f1860
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82305024 unmapped: 1540096 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 936444 data_alloc: 218103808 data_used: 155648
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82305024 unmapped: 1540096 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82305024 unmapped: 1540096 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82305024 unmapped: 1540096 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82305024 unmapped: 1540096 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82305024 unmapped: 1540096 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 936444 data_alloc: 218103808 data_used: 155648
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82305024 unmapped: 1540096 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82305024 unmapped: 1540096 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82305024 unmapped: 1540096 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82305024 unmapped: 1540096 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread fragmentation_score=0.000026 took=0.000049s
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82305024 unmapped: 1540096 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 936444 data_alloc: 218103808 data_used: 155648
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 29.295097351s of 29.695419312s, submitted: 10
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82305024 unmapped: 1540096 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82305024 unmapped: 1540096 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82305024 unmapped: 1540096 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82313216 unmapped: 1531904 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82313216 unmapped: 1531904 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 938104 data_alloc: 218103808 data_used: 151552
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82313216 unmapped: 1531904 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82313216 unmapped: 1531904 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82313216 unmapped: 1531904 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82329600 unmapped: 1515520 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82337792 unmapped: 1507328 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 937345 data_alloc: 218103808 data_used: 151552
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82337792 unmapped: 1507328 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82337792 unmapped: 1507328 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82337792 unmapped: 1507328 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 13.566009521s of 13.605023384s, submitted: 11
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82337792 unmapped: 1507328 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82337792 unmapped: 1507328 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 937365 data_alloc: 218103808 data_used: 155648
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82337792 unmapped: 1507328 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82337792 unmapped: 1507328 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82337792 unmapped: 1507328 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82337792 unmapped: 1507328 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82337792 unmapped: 1507328 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 937365 data_alloc: 218103808 data_used: 155648
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82337792 unmapped: 1507328 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82337792 unmapped: 1507328 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82337792 unmapped: 1507328 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82337792 unmapped: 1507328 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82337792 unmapped: 1507328 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 937365 data_alloc: 218103808 data_used: 155648
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82337792 unmapped: 1507328 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82337792 unmapped: 1507328 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82337792 unmapped: 1507328 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82354176 unmapped: 1490944 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82354176 unmapped: 1490944 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 937365 data_alloc: 218103808 data_used: 155648
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82354176 unmapped: 1490944 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82354176 unmapped: 1490944 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82354176 unmapped: 1490944 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82354176 unmapped: 1490944 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82354176 unmapped: 1490944 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 937365 data_alloc: 218103808 data_used: 155648
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82354176 unmapped: 1490944 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82354176 unmapped: 1490944 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82354176 unmapped: 1490944 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82354176 unmapped: 1490944 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82354176 unmapped: 1490944 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 937365 data_alloc: 218103808 data_used: 155648
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82354176 unmapped: 1490944 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82354176 unmapped: 1490944 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82354176 unmapped: 1490944 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 ms_handle_reset con 0x55e0b47d0c00 session 0x55e0b8283680
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82354176 unmapped: 1490944 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82354176 unmapped: 1490944 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 937365 data_alloc: 218103808 data_used: 155648
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82354176 unmapped: 1490944 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82362368 unmapped: 1482752 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82362368 unmapped: 1482752 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82378752 unmapped: 1466368 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82378752 unmapped: 1466368 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 937365 data_alloc: 218103808 data_used: 155648
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82378752 unmapped: 1466368 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82378752 unmapped: 1466368 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82378752 unmapped: 1466368 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82378752 unmapped: 1466368 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 40.388652802s of 40.391891479s, submitted: 1
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82395136 unmapped: 1449984 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 937497 data_alloc: 218103808 data_used: 155648
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82395136 unmapped: 1449984 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82395136 unmapped: 1449984 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82411520 unmapped: 1433600 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82411520 unmapped: 1433600 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82411520 unmapped: 1433600 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 937513 data_alloc: 218103808 data_used: 151552
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82444288 unmapped: 1400832 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82460672 unmapped: 1384448 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82460672 unmapped: 1384448 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 936754 data_alloc: 218103808 data_used: 151552
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 13.199851990s of 13.357616425s, submitted: 11
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82534400 unmapped: 1310720 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82550784 unmapped: 1294336 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82550784 unmapped: 1294336 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 936774 data_alloc: 218103808 data_used: 155648
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 ms_handle_reset con 0x55e0b78b6000 session 0x55e0b7da9e00
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 ms_handle_reset con 0x55e0b6a18c00 session 0x55e0b7da94a0
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 936774 data_alloc: 218103808 data_used: 155648
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 936774 data_alloc: 218103808 data_used: 155648
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 14.696914673s of 14.699798584s, submitted: 1
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82575360 unmapped: 1269760 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82575360 unmapped: 1269760 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82575360 unmapped: 1269760 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 937038 data_alloc: 218103808 data_used: 155648
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82452480 unmapped: 1392640 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82452480 unmapped: 1392640 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82452480 unmapped: 1392640 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82460672 unmapped: 1384448 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82460672 unmapped: 1384448 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 940078 data_alloc: 218103808 data_used: 151552
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82460672 unmapped: 1384448 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82460672 unmapped: 1384448 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82460672 unmapped: 1384448 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82460672 unmapped: 1384448 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.845451355s of 12.939780235s, submitted: 14
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82493440 unmapped: 1351680 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939930 data_alloc: 218103808 data_used: 155648
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82493440 unmapped: 1351680 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82493440 unmapped: 1351680 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82493440 unmapped: 1351680 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82493440 unmapped: 1351680 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82501632 unmapped: 1343488 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939798 data_alloc: 218103808 data_used: 155648
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82501632 unmapped: 1343488 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82501632 unmapped: 1343488 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82501632 unmapped: 1343488 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82501632 unmapped: 1343488 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82501632 unmapped: 1343488 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939798 data_alloc: 218103808 data_used: 155648
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82501632 unmapped: 1343488 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82501632 unmapped: 1343488 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82501632 unmapped: 1343488 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82509824 unmapped: 1335296 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82509824 unmapped: 1335296 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939798 data_alloc: 218103808 data_used: 155648
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82509824 unmapped: 1335296 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82509824 unmapped: 1335296 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82509824 unmapped: 1335296 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82509824 unmapped: 1335296 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82509824 unmapped: 1335296 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939798 data_alloc: 218103808 data_used: 155648
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82509824 unmapped: 1335296 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82509824 unmapped: 1335296 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca86000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82509824 unmapped: 1335296 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82509824 unmapped: 1335296 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 24.911399841s of 24.919387817s, submitted: 2
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82509824 unmapped: 1335296 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939798 data_alloc: 218103808 data_used: 155648
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82599936 unmapped: 1245184 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82698240 unmapped: 1146880 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82706432 unmapped: 1138688 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82706432 unmapped: 1138688 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82706432 unmapped: 1138688 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939798 data_alloc: 218103808 data_used: 155648
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82706432 unmapped: 1138688 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82706432 unmapped: 1138688 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82706432 unmapped: 1138688 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82714624 unmapped: 1130496 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82714624 unmapped: 1130496 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939798 data_alloc: 218103808 data_used: 155648
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82714624 unmapped: 1130496 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82714624 unmapped: 1130496 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82722816 unmapped: 1122304 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82722816 unmapped: 1122304 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82722816 unmapped: 1122304 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939798 data_alloc: 218103808 data_used: 155648
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82722816 unmapped: 1122304 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82731008 unmapped: 1114112 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82731008 unmapped: 1114112 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82731008 unmapped: 1114112 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82731008 unmapped: 1114112 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939798 data_alloc: 218103808 data_used: 155648
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 ms_handle_reset con 0x55e0b78fd800 session 0x55e0b7da8960
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82731008 unmapped: 1114112 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82731008 unmapped: 1114112 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82731008 unmapped: 1114112 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82739200 unmapped: 1105920 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82747392 unmapped: 1097728 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939798 data_alloc: 218103808 data_used: 155648
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82747392 unmapped: 1097728 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82747392 unmapped: 1097728 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 ms_handle_reset con 0x55e0b78fdc00 session 0x55e0b4d35e00
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82747392 unmapped: 1097728 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82747392 unmapped: 1097728 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82747392 unmapped: 1097728 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939798 data_alloc: 218103808 data_used: 155648
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82747392 unmapped: 1097728 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 31.330074310s of 32.056125641s, submitted: 220
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82755584 unmapped: 1089536 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82755584 unmapped: 1089536 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82763776 unmapped: 1081344 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82763776 unmapped: 1081344 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939946 data_alloc: 218103808 data_used: 151552
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82763776 unmapped: 1081344 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82763776 unmapped: 1081344 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82763776 unmapped: 1081344 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82804736 unmapped: 1040384 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82804736 unmapped: 1040384 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 940078 data_alloc: 218103808 data_used: 151552
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82804736 unmapped: 1040384 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82804736 unmapped: 1040384 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82804736 unmapped: 1040384 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82804736 unmapped: 1040384 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.306317329s of 12.338130951s, submitted: 11
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82804736 unmapped: 1040384 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 938896 data_alloc: 218103808 data_used: 151552
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82804736 unmapped: 1040384 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82804736 unmapped: 1040384 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 938748 data_alloc: 218103808 data_used: 155648
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 938616 data_alloc: 218103808 data_used: 155648
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 938616 data_alloc: 218103808 data_used: 155648
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 938616 data_alloc: 218103808 data_used: 155648
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 938616 data_alloc: 218103808 data_used: 155648
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 938616 data_alloc: 218103808 data_used: 155648
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 938616 data_alloc: 218103808 data_used: 155648
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 938616 data_alloc: 218103808 data_used: 155648
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 ms_handle_reset con 0x55e0b47d0c00 session 0x55e0b7d354a0
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 938616 data_alloc: 218103808 data_used: 155648
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 ms_handle_reset con 0x55e0b6a18c00 session 0x55e0b4a8d2c0
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 938616 data_alloc: 218103808 data_used: 155648
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 55.401966095s of 55.432224274s, submitted: 9
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 938748 data_alloc: 218103808 data_used: 155648
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82837504 unmapped: 1007616 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82804736 unmapped: 1040384 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82804736 unmapped: 1040384 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 940240 data_alloc: 218103808 data_used: 151552
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82804736 unmapped: 1040384 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 940408 data_alloc: 218103808 data_used: 151552
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 1024000 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.045467377s of 12.250482559s, submitted: 11
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 83877888 unmapped: 1015808 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 83877888 unmapped: 1015808 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 83877888 unmapped: 1015808 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 958464 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 940392 data_alloc: 218103808 data_used: 155648
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 958464 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 83959808 unmapped: 933888 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 83976192 unmapped: 917504 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 83976192 unmapped: 917504 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 83976192 unmapped: 917504 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939537 data_alloc: 218103808 data_used: 155648
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 83976192 unmapped: 917504 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 83976192 unmapped: 917504 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 ms_handle_reset con 0x55e0b78b6000 session 0x55e0b832a780
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 83992576 unmapped: 901120 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 83992576 unmapped: 901120 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 83992576 unmapped: 901120 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939537 data_alloc: 218103808 data_used: 155648
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 83992576 unmapped: 901120 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 83992576 unmapped: 901120 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 83992576 unmapped: 901120 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 83992576 unmapped: 901120 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 83992576 unmapped: 901120 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939537 data_alloc: 218103808 data_used: 155648
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 83992576 unmapped: 901120 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 83992576 unmapped: 901120 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 83992576 unmapped: 901120 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 22.239133835s of 22.275657654s, submitted: 10
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 83992576 unmapped: 901120 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 83992576 unmapped: 901120 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939669 data_alloc: 218103808 data_used: 155648
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 83992576 unmapped: 901120 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84008960 unmapped: 884736 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84008960 unmapped: 884736 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84008960 unmapped: 884736 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84025344 unmapped: 868352 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939685 data_alloc: 218103808 data_used: 151552
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84025344 unmapped: 868352 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84025344 unmapped: 868352 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84058112 unmapped: 835584 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84058112 unmapped: 835584 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84058112 unmapped: 835584 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939685 data_alloc: 218103808 data_used: 151552
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84058112 unmapped: 835584 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84058112 unmapped: 835584 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84058112 unmapped: 835584 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 14.702522278s of 14.757088661s, submitted: 10
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84082688 unmapped: 811008 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84082688 unmapped: 811008 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939385 data_alloc: 218103808 data_used: 151552
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84082688 unmapped: 811008 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84082688 unmapped: 811008 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84082688 unmapped: 811008 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84082688 unmapped: 811008 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84082688 unmapped: 811008 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939537 data_alloc: 218103808 data_used: 155648
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84082688 unmapped: 811008 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84082688 unmapped: 811008 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 ms_handle_reset con 0x55e0b78b6400 session 0x55e0b82b0d20
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84082688 unmapped: 811008 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84082688 unmapped: 811008 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84082688 unmapped: 811008 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939537 data_alloc: 218103808 data_used: 155648
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84082688 unmapped: 811008 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84082688 unmapped: 811008 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84082688 unmapped: 811008 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84082688 unmapped: 811008 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84082688 unmapped: 811008 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939537 data_alloc: 218103808 data_used: 155648
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84082688 unmapped: 811008 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84082688 unmapped: 811008 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84082688 unmapped: 811008 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 20.179632187s of 20.300897598s, submitted: 1
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84099072 unmapped: 794624 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84099072 unmapped: 794624 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939669 data_alloc: 218103808 data_used: 155648
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84099072 unmapped: 794624 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84107264 unmapped: 786432 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84107264 unmapped: 786432 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84107264 unmapped: 786432 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84123648 unmapped: 770048 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941197 data_alloc: 218103808 data_used: 151552
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84123648 unmapped: 770048 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84123648 unmapped: 770048 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84131840 unmapped: 761856 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84131840 unmapped: 761856 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84131840 unmapped: 761856 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941197 data_alloc: 218103808 data_used: 151552
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84131840 unmapped: 761856 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84131840 unmapped: 761856 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84131840 unmapped: 761856 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 14.630328178s of 14.677761078s, submitted: 11
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84140032 unmapped: 753664 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84140032 unmapped: 753664 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 940897 data_alloc: 218103808 data_used: 151552
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84140032 unmapped: 753664 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84140032 unmapped: 753664 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84140032 unmapped: 753664 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 ms_handle_reset con 0x55e0b6eee400 session 0x55e0b82db0e0
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84140032 unmapped: 753664 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84140032 unmapped: 753664 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941049 data_alloc: 218103808 data_used: 155648
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84140032 unmapped: 753664 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84140032 unmapped: 753664 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84140032 unmapped: 753664 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84140032 unmapped: 753664 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84140032 unmapped: 753664 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941049 data_alloc: 218103808 data_used: 155648
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84140032 unmapped: 753664 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84140032 unmapped: 753664 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84140032 unmapped: 753664 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 15.983519554s of 15.986623764s, submitted: 1
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84140032 unmapped: 753664 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84140032 unmapped: 753664 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941181 data_alloc: 218103808 data_used: 155648
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84140032 unmapped: 753664 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fc676000/0x0/0x4ffc00000, data 0xe7bdf/0x196000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 136 handle_osd_map epochs [137,137], i have 136, src has [1,137]
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84140032 unmapped: 753664 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84140032 unmapped: 753664 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 137 handle_osd_map epochs [138,138], i have 137, src has [1,138]
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 138 ms_handle_reset con 0x55e0b78b6400 session 0x55e0b7f1be00
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84246528 unmapped: 17432576 heap: 101679104 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 138 ms_handle_reset con 0x55e0b6a18c00 session 0x55e0b82a7680
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 138 handle_osd_map epochs [139,139], i have 138, src has [1,139]
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84320256 unmapped: 25755648 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1061229 data_alloc: 218103808 data_used: 155648
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84320256 unmapped: 25755648 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 139 handle_osd_map epochs [139,140], i have 139, src has [1,140]
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 140 ms_handle_reset con 0x55e0b78b6000 session 0x55e0b7d7da40
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 140 heartbeat osd_stat(store_statfs(0x4fb669000/0x0/0x4ffc00000, data 0x10ee025/0x11a0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84369408 unmapped: 25706496 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84369408 unmapped: 25706496 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84369408 unmapped: 25706496 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84369408 unmapped: 25706496 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1064251 data_alloc: 218103808 data_used: 155648
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84369408 unmapped: 25706496 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 140 heartbeat osd_stat(store_statfs(0x4fb667000/0x0/0x4ffc00000, data 0x10f01a6/0x11a4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 140 heartbeat osd_stat(store_statfs(0x4fb667000/0x0/0x4ffc00000, data 0x10f01a6/0x11a4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 140 handle_osd_map epochs [141,141], i have 140, src has [1,141]
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.795370102s of 12.924592018s, submitted: 30
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84385792 unmapped: 25690112 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84385792 unmapped: 25690112 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84385792 unmapped: 25690112 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84393984 unmapped: 25681920 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1066185 data_alloc: 218103808 data_used: 151552
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84393984 unmapped: 25681920 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84393984 unmapped: 25681920 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 141 heartbeat osd_stat(store_statfs(0x4fb665000/0x0/0x4ffc00000, data 0x10f21ce/0x11a7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84393984 unmapped: 25681920 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84393984 unmapped: 25681920 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84393984 unmapped: 25681920 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1066185 data_alloc: 218103808 data_used: 151552
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84393984 unmapped: 25681920 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84393984 unmapped: 25681920 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84402176 unmapped: 25673728 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 141 heartbeat osd_stat(store_statfs(0x4fb665000/0x0/0x4ffc00000, data 0x10f21ce/0x11a7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.582818031s of 11.626168251s, submitted: 21
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84410368 unmapped: 25665536 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84410368 unmapped: 25665536 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1066169 data_alloc: 218103808 data_used: 155648
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84410368 unmapped: 25665536 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84418560 unmapped: 25657344 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 141 heartbeat osd_stat(store_statfs(0x4fb665000/0x0/0x4ffc00000, data 0x10f21ce/0x11a7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84418560 unmapped: 25657344 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84426752 unmapped: 25649152 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84434944 unmapped: 25640960 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1065446 data_alloc: 218103808 data_used: 155648
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84434944 unmapped: 25640960 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 141 ms_handle_reset con 0x55e0b78fdc00 session 0x55e0b7d443c0
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 141 heartbeat osd_stat(store_statfs(0x4fb665000/0x0/0x4ffc00000, data 0x10f21ce/0x11a7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84434944 unmapped: 25640960 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84434944 unmapped: 25640960 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84434944 unmapped: 25640960 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84434944 unmapped: 25640960 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1065446 data_alloc: 218103808 data_used: 155648
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84434944 unmapped: 25640960 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84434944 unmapped: 25640960 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 141 heartbeat osd_stat(store_statfs(0x4fb665000/0x0/0x4ffc00000, data 0x10f21ce/0x11a7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84434944 unmapped: 25640960 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84434944 unmapped: 25640960 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84434944 unmapped: 25640960 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1065446 data_alloc: 218103808 data_used: 155648
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84434944 unmapped: 25640960 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 141 heartbeat osd_stat(store_statfs(0x4fb665000/0x0/0x4ffc00000, data 0x10f21ce/0x11a7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84434944 unmapped: 25640960 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 18.755990982s of 18.777509689s, submitted: 5
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 141 heartbeat osd_stat(store_statfs(0x4fb665000/0x0/0x4ffc00000, data 0x10f21ce/0x11a7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84434944 unmapped: 25640960 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 141 heartbeat osd_stat(store_statfs(0x4fb665000/0x0/0x4ffc00000, data 0x10f21ce/0x11a7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84434944 unmapped: 25640960 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 84434944 unmapped: 25640960 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1065578 data_alloc: 218103808 data_used: 155648
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 141 ms_handle_reset con 0x55e0b6a18c00 session 0x55e0b7bb7680
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 91930624 unmapped: 18145280 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 141 ms_handle_reset con 0x55e0b6eee400 session 0x55e0b5254d20
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 91930624 unmapped: 18145280 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 91930624 unmapped: 18145280 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 141 handle_osd_map epochs [142,142], i have 141, src has [1,142]
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 142 handle_osd_map epochs [142,143], i have 142, src has [1,143]
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 143 heartbeat osd_stat(store_statfs(0x4fb661000/0x0/0x4ffc00000, data 0x10f4310/0x11aa000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [0,0,0,0,0,0,0,0,3])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 17489920 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 143 ms_handle_reset con 0x55e0b78b6000 session 0x55e0b82a6960
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 143 ms_handle_reset con 0x55e0b78b6400 session 0x55e0b82b1a40
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 92069888 unmapped: 18006016 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1170125 data_alloc: 218103808 data_used: 6971392
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 92069888 unmapped: 18006016 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 92069888 unmapped: 18006016 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 143 ms_handle_reset con 0x55e0b78fd400 session 0x55e0b59094a0
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 143 heartbeat osd_stat(store_statfs(0x4fabdc000/0x0/0x4ffc00000, data 0x1b76508/0x1c2e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 92086272 unmapped: 17989632 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 92086272 unmapped: 17989632 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 143 ms_handle_reset con 0x55e0b78fd400 session 0x55e0b7bff0e0
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 143 heartbeat osd_stat(store_statfs(0x4fabdc000/0x0/0x4ffc00000, data 0x1b76508/0x1c2e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 92086272 unmapped: 17989632 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1170125 data_alloc: 218103808 data_used: 6971392
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 143 ms_handle_reset con 0x55e0b6a18c00 session 0x55e0b8230960
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.781499863s of 13.647070885s, submitted: 47
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 143 ms_handle_reset con 0x55e0b6eee400 session 0x55e0b8248960
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 92397568 unmapped: 17678336 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 92405760 unmapped: 17670144 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 143 handle_osd_map epochs [144,144], i have 143, src has [1,144]
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 92413952 unmapped: 17661952 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 101523456 unmapped: 8552448 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fabb5000/0x0/0x4ffc00000, data 0x1b9c540/0x1c56000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 101539840 unmapped: 8536064 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1251826 data_alloc: 234881024 data_used: 16932864
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 101539840 unmapped: 8536064 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 101539840 unmapped: 8536064 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 101629952 unmapped: 8445952 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 101629952 unmapped: 8445952 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 101629952 unmapped: 8445952 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1251978 data_alloc: 234881024 data_used: 16936960
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fabb5000/0x0/0x4ffc00000, data 0x1b9c540/0x1c56000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 101646336 unmapped: 8429568 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fabb5000/0x0/0x4ffc00000, data 0x1b9c540/0x1c56000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 101679104 unmapped: 8396800 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 101679104 unmapped: 8396800 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 13.211468697s of 13.280833244s, submitted: 18
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 101679104 unmapped: 8396800 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 103358464 unmapped: 6717440 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279464 data_alloc: 234881024 data_used: 17043456
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b78fd800 session 0x55e0b830be00
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 103358464 unmapped: 6717440 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104243200 unmapped: 5832704 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa839000/0x0/0x4ffc00000, data 0x1f19540/0x1fd3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104243200 unmapped: 5832704 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104243200 unmapped: 5832704 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104251392 unmapped: 5824512 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280690 data_alloc: 234881024 data_used: 17043456
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104251392 unmapped: 5824512 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104251392 unmapped: 5824512 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104251392 unmapped: 5824512 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa839000/0x0/0x4ffc00000, data 0x1f19540/0x1fd3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104251392 unmapped: 5824512 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa839000/0x0/0x4ffc00000, data 0x1f19540/0x1fd3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104251392 unmapped: 5824512 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280690 data_alloc: 234881024 data_used: 17043456
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104251392 unmapped: 5824512 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.150278091s of 12.817354202s, submitted: 32
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104267776 unmapped: 5808128 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104267776 unmapped: 5808128 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104267776 unmapped: 5808128 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa839000/0x0/0x4ffc00000, data 0x1f19540/0x1fd3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104284160 unmapped: 5791744 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280278 data_alloc: 234881024 data_used: 17043456
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104284160 unmapped: 5791744 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104284160 unmapped: 5791744 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104284160 unmapped: 5791744 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa839000/0x0/0x4ffc00000, data 0x1f19540/0x1fd3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104284160 unmapped: 5791744 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104284160 unmapped: 5791744 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280278 data_alloc: 234881024 data_used: 17043456
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa839000/0x0/0x4ffc00000, data 0x1f19540/0x1fd3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104284160 unmapped: 5791744 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b47d0c00 session 0x55e0b82a7c20
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104284160 unmapped: 5791744 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa839000/0x0/0x4ffc00000, data 0x1f19540/0x1fd3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104284160 unmapped: 5791744 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104284160 unmapped: 5791744 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104284160 unmapped: 5791744 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280278 data_alloc: 234881024 data_used: 17043456
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6a18c00 session 0x55e0b82301e0
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6eee400 session 0x55e0b82d8000
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b78fd400 session 0x55e0b82483c0
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104292352 unmapped: 5783552 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b78fd800 session 0x55e0b7d7c000
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b78fc800 session 0x55e0b82b14a0
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 105472000 unmapped: 4603904 heap: 110075904 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa839000/0x0/0x4ffc00000, data 0x1f19540/0x1fd3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6a18c00 session 0x55e0b82d9c20
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 15.499177933s of 15.732673645s, submitted: 10
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6eee400 session 0x55e0b4d35e00
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b78fc800 session 0x55e0b823a1e0
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b78fd400 session 0x55e0b859d680
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b78fd800 session 0x55e0b823be00
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6a18c00 session 0x55e0b81f14a0
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104882176 unmapped: 6815744 heap: 111697920 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104882176 unmapped: 6815744 heap: 111697920 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104882176 unmapped: 6815744 heap: 111697920 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1302476 data_alloc: 234881024 data_used: 17567744
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa60e000/0x0/0x4ffc00000, data 0x2143550/0x21fe000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104882176 unmapped: 6815744 heap: 111697920 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104931328 unmapped: 6766592 heap: 111697920 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa60e000/0x0/0x4ffc00000, data 0x2143550/0x21fe000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa60c000/0x0/0x4ffc00000, data 0x2144550/0x21ff000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104931328 unmapped: 6766592 heap: 111697920 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa60c000/0x0/0x4ffc00000, data 0x2144550/0x21ff000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104931328 unmapped: 6766592 heap: 111697920 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b78fc800 session 0x55e0b859d2c0
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104538112 unmapped: 7159808 heap: 111697920 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1305444 data_alloc: 234881024 data_used: 17567744
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104554496 unmapped: 7143424 heap: 111697920 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104554496 unmapped: 7143424 heap: 111697920 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.507975578s of 10.574006081s, submitted: 14
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104546304 unmapped: 7151616 heap: 111697920 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104546304 unmapped: 7151616 heap: 111697920 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa5e9000/0x0/0x4ffc00000, data 0x2168550/0x2223000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 105062400 unmapped: 6635520 heap: 111697920 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1313036 data_alloc: 234881024 data_used: 18292736
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 106315776 unmapped: 5382144 heap: 111697920 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 106315776 unmapped: 5382144 heap: 111697920 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa5e9000/0x0/0x4ffc00000, data 0x2168550/0x2223000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa5e9000/0x0/0x4ffc00000, data 0x2168550/0x2223000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 106348544 unmapped: 5349376 heap: 111697920 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 106348544 unmapped: 5349376 heap: 111697920 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 106348544 unmapped: 5349376 heap: 111697920 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1322308 data_alloc: 234881024 data_used: 19689472
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 106348544 unmapped: 5349376 heap: 111697920 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 106348544 unmapped: 5349376 heap: 111697920 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 106348544 unmapped: 5349376 heap: 111697920 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.600978851s of 10.616786003s, submitted: 5
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa5e9000/0x0/0x4ffc00000, data 0x2168550/0x2223000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 106348544 unmapped: 5349376 heap: 111697920 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109969408 unmapped: 1728512 heap: 111697920 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1343706 data_alloc: 234881024 data_used: 19714048
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0.
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112353280 unmapped: 5341184 heap: 117694464 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112779264 unmapped: 4915200 heap: 117694464 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112779264 unmapped: 4915200 heap: 117694464 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f8c40000/0x0/0x4ffc00000, data 0x2969550/0x2a24000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112812032 unmapped: 4882432 heap: 117694464 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f8c40000/0x0/0x4ffc00000, data 0x2969550/0x2a24000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112812032 unmapped: 4882432 heap: 117694464 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1400442 data_alloc: 234881024 data_used: 20185088
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112812032 unmapped: 4882432 heap: 117694464 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112812032 unmapped: 4882432 heap: 117694464 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112812032 unmapped: 4882432 heap: 117694464 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112812032 unmapped: 4882432 heap: 117694464 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f8c40000/0x0/0x4ffc00000, data 0x2969550/0x2a24000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b78fd400 session 0x55e0b7d7d680
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b7c07000 session 0x55e0b7940f00
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112812032 unmapped: 4882432 heap: 117694464 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1400458 data_alloc: 234881024 data_used: 20185088
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.637441635s of 11.983428001s, submitted: 64
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110370816 unmapped: 7323648 heap: 117694464 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6eef800 session 0x55e0b7d2f2c0
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9698000/0x0/0x4ffc00000, data 0x1f1a540/0x1fd4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110379008 unmapped: 7315456 heap: 117694464 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9698000/0x0/0x4ffc00000, data 0x1f1a540/0x1fd4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110379008 unmapped: 7315456 heap: 117694464 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110379008 unmapped: 7315456 heap: 117694464 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110379008 unmapped: 7315456 heap: 117694464 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1290388 data_alloc: 234881024 data_used: 17440768
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110379008 unmapped: 7315456 heap: 117694464 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110379008 unmapped: 7315456 heap: 117694464 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110379008 unmapped: 7315456 heap: 117694464 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9698000/0x0/0x4ffc00000, data 0x1f1a540/0x1fd4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110379008 unmapped: 7315456 heap: 117694464 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b78fd000 session 0x55e0b7d454a0
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b78b6400 session 0x55e0b7d7d2c0
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 103915520 unmapped: 13778944 heap: 117694464 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1122142 data_alloc: 218103808 data_used: 7475200
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b85b9400 session 0x55e0b7a021e0
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 103874560 unmapped: 13819904 heap: 117694464 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 103874560 unmapped: 13819904 heap: 117694464 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 103874560 unmapped: 13819904 heap: 117694464 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa4bc000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 103874560 unmapped: 13819904 heap: 117694464 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 103874560 unmapped: 13819904 heap: 117694464 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1114345 data_alloc: 218103808 data_used: 7360512
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 103874560 unmapped: 13819904 heap: 117694464 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 103874560 unmapped: 13819904 heap: 117694464 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 103874560 unmapped: 13819904 heap: 117694464 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 103874560 unmapped: 13819904 heap: 117694464 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa4bc000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 103874560 unmapped: 13819904 heap: 117694464 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1114345 data_alloc: 218103808 data_used: 7360512
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 103874560 unmapped: 13819904 heap: 117694464 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa4bc000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 103874560 unmapped: 13819904 heap: 117694464 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 103874560 unmapped: 13819904 heap: 117694464 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 103874560 unmapped: 13819904 heap: 117694464 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 103874560 unmapped: 13819904 heap: 117694464 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1114345 data_alloc: 218103808 data_used: 7360512
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 103874560 unmapped: 13819904 heap: 117694464 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 103874560 unmapped: 13819904 heap: 117694464 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa4bc000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 103874560 unmapped: 13819904 heap: 117694464 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 103874560 unmapped: 13819904 heap: 117694464 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 103874560 unmapped: 13819904 heap: 117694464 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1114345 data_alloc: 218103808 data_used: 7360512
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 103874560 unmapped: 13819904 heap: 117694464 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 103874560 unmapped: 13819904 heap: 117694464 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa4bc000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 32.091907501s of 32.341941833s, submitted: 45
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b5291800 session 0x55e0b7d45a40
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b5290800 session 0x55e0b7bfe3c0
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b5290c00 session 0x55e0b7cc5a40
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b5291800 session 0x55e0b8231860
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b5290800 session 0x55e0b7f1a5a0
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 103653376 unmapped: 22511616 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6eee400 session 0x55e0b859c960
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 103653376 unmapped: 22511616 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 103653376 unmapped: 22511616 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1167432 data_alloc: 218103808 data_used: 7360512
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 103653376 unmapped: 22511616 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9dfe000/0x0/0x4ffc00000, data 0x17b64ce/0x186e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 103653376 unmapped: 22511616 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 103653376 unmapped: 22511616 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 103653376 unmapped: 22511616 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b78b6400 session 0x55e0b7d521e0
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 103653376 unmapped: 22511616 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1167432 data_alloc: 218103808 data_used: 7360512
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b5290800 session 0x55e0b7d7cd20
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b5290c00 session 0x55e0b523ba40
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 103636992 unmapped: 22528000 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b5291800 session 0x55e0b7d7c5a0
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9dfe000/0x0/0x4ffc00000, data 0x17b64ce/0x186e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 103636992 unmapped: 22528000 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 103636992 unmapped: 22528000 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 103702528 unmapped: 22462464 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 106070016 unmapped: 20094976 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1214228 data_alloc: 234881024 data_used: 14315520
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 106070016 unmapped: 20094976 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9dfe000/0x0/0x4ffc00000, data 0x17b64ce/0x186e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 106070016 unmapped: 20094976 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 106070016 unmapped: 20094976 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 106070016 unmapped: 20094976 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 106070016 unmapped: 20094976 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1214228 data_alloc: 234881024 data_used: 14315520
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 106070016 unmapped: 20094976 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 106070016 unmapped: 20094976 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9dfe000/0x0/0x4ffc00000, data 0x17b64ce/0x186e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b78fcc00 session 0x55e0b7d2e3c0
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 106070016 unmapped: 20094976 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b78fdc00 session 0x55e0b7d2e1e0
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 106070016 unmapped: 20094976 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 22.142061234s of 22.214815140s, submitted: 17
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110362624 unmapped: 15802368 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1255460 data_alloc: 234881024 data_used: 14860288
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110510080 unmapped: 15654912 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109092864 unmapped: 17072128 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f994e000/0x0/0x4ffc00000, data 0x1c5e4ce/0x1d16000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109092864 unmapped: 17072128 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109092864 unmapped: 17072128 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109092864 unmapped: 17072128 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1267452 data_alloc: 234881024 data_used: 15208448
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109092864 unmapped: 17072128 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109101056 unmapped: 17063936 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9956000/0x0/0x4ffc00000, data 0x1c5e4ce/0x1d16000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109101056 unmapped: 17063936 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9956000/0x0/0x4ffc00000, data 0x1c5e4ce/0x1d16000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109101056 unmapped: 17063936 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109101056 unmapped: 17063936 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1263052 data_alloc: 234881024 data_used: 15216640
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109101056 unmapped: 17063936 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9956000/0x0/0x4ffc00000, data 0x1c5e4ce/0x1d16000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109019136 unmapped: 17145856 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.508072853s of 12.713724136s, submitted: 69
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109019136 unmapped: 17145856 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109019136 unmapped: 17145856 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9956000/0x0/0x4ffc00000, data 0x1c5e4ce/0x1d16000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109051904 unmapped: 17113088 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1263068 data_alloc: 234881024 data_used: 15212544
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109051904 unmapped: 17113088 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109051904 unmapped: 17113088 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9956000/0x0/0x4ffc00000, data 0x1c5e4ce/0x1d16000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109084672 unmapped: 17080320 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109101056 unmapped: 17063936 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9956000/0x0/0x4ffc00000, data 0x1c5e4ce/0x1d16000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109101056 unmapped: 17063936 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1264284 data_alloc: 234881024 data_used: 15290368
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9956000/0x0/0x4ffc00000, data 0x1c5e4ce/0x1d16000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109117440 unmapped: 17047552 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9956000/0x0/0x4ffc00000, data 0x1c5e4ce/0x1d16000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109117440 unmapped: 17047552 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109117440 unmapped: 17047552 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.391874313s of 11.401729584s, submitted: 3
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6eee400 session 0x55e0b4a8c1e0
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104800256 unmapped: 21364736 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b5290c00 session 0x55e0b7bb7c20
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9956000/0x0/0x4ffc00000, data 0x1c5e4ce/0x1d16000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104800256 unmapped: 21364736 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1121954 data_alloc: 218103808 data_used: 7360512
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa4bc000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104800256 unmapped: 21364736 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104800256 unmapped: 21364736 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104800256 unmapped: 21364736 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa4bc000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104800256 unmapped: 21364736 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104800256 unmapped: 21364736 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1121954 data_alloc: 218103808 data_used: 7360512
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104800256 unmapped: 21364736 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa4bc000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104800256 unmapped: 21364736 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104800256 unmapped: 21364736 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104800256 unmapped: 21364736 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104800256 unmapped: 21364736 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1121954 data_alloc: 218103808 data_used: 7360512
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa4bc000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104800256 unmapped: 21364736 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104800256 unmapped: 21364736 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104800256 unmapped: 21364736 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa4bc000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104800256 unmapped: 21364736 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104800256 unmapped: 21364736 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1121954 data_alloc: 218103808 data_used: 7360512
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104800256 unmapped: 21364736 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104800256 unmapped: 21364736 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa4bc000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104800256 unmapped: 21364736 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104800256 unmapped: 21364736 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104808448 unmapped: 21356544 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1121954 data_alloc: 218103808 data_used: 7360512
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa4bc000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104808448 unmapped: 21356544 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104808448 unmapped: 21356544 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104808448 unmapped: 21356544 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104808448 unmapped: 21356544 heap: 126164992 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 26.203746796s of 26.265848160s, submitted: 25
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b5291800 session 0x55e0b7bb8780
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104218624 unmapped: 26148864 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1188520 data_alloc: 218103808 data_used: 6967296
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9c37000/0x0/0x4ffc00000, data 0x197d4ce/0x1a35000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104218624 unmapped: 26148864 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104218624 unmapped: 26148864 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9c37000/0x0/0x4ffc00000, data 0x197d4ce/0x1a35000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104218624 unmapped: 26148864 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9c37000/0x0/0x4ffc00000, data 0x197d4ce/0x1a35000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104218624 unmapped: 26148864 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b78fcc00 session 0x55e0b78b3c20
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104218624 unmapped: 26148864 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1188520 data_alloc: 218103808 data_used: 6967296
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b78fd800 session 0x55e0b52334a0
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104218624 unmapped: 26148864 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b5290c00 session 0x55e0b7943e00
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b5291800 session 0x55e0b6e87e00
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104251392 unmapped: 26116096 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 104251392 unmapped: 26116096 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9c36000/0x0/0x4ffc00000, data 0x197d4de/0x1a36000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 108912640 unmapped: 21454848 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 108912640 unmapped: 21454848 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1250374 data_alloc: 234881024 data_used: 15360000
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9c36000/0x0/0x4ffc00000, data 0x197d4de/0x1a36000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 108912640 unmapped: 21454848 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1800.0 total, 600.0 interval#012Cumulative writes: 9259 writes, 35K keys, 9259 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s#012Cumulative WAL: 9259 writes, 2171 syncs, 4.26 writes per sync, written: 0.02 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1372 writes, 3774 keys, 1372 commit groups, 1.0 writes per commit group, ingest: 3.39 MB, 0.01 MB/s#012Interval WAL: 1372 writes, 622 syncs, 2.21 writes per sync, written: 0.00 GB, 0.01 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 108912640 unmapped: 21454848 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 108912640 unmapped: 21454848 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 108912640 unmapped: 21454848 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 108920832 unmapped: 21446656 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1250374 data_alloc: 234881024 data_used: 15360000
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9c36000/0x0/0x4ffc00000, data 0x197d4de/0x1a36000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 108920832 unmapped: 21446656 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9c36000/0x0/0x4ffc00000, data 0x197d4de/0x1a36000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 108920832 unmapped: 21446656 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 108920832 unmapped: 21446656 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 18.684480667s of 18.732368469s, submitted: 11
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 111058944 unmapped: 19308544 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 111206400 unmapped: 19161088 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1310776 data_alloc: 234881024 data_used: 15597568
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f93e2000/0x0/0x4ffc00000, data 0x21d14de/0x228a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110436352 unmapped: 19931136 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110436352 unmapped: 19931136 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110436352 unmapped: 19931136 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110436352 unmapped: 19931136 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110436352 unmapped: 19931136 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1310594 data_alloc: 234881024 data_used: 15597568
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110436352 unmapped: 19931136 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f93e0000/0x0/0x4ffc00000, data 0x21d34de/0x228c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f93e0000/0x0/0x4ffc00000, data 0x21d34de/0x228c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110469120 unmapped: 19898368 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110477312 unmapped: 19890176 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110477312 unmapped: 19890176 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f93e0000/0x0/0x4ffc00000, data 0x21d34de/0x228c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110477312 unmapped: 19890176 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1310594 data_alloc: 234881024 data_used: 15597568
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110477312 unmapped: 19890176 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110477312 unmapped: 19890176 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110477312 unmapped: 19890176 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f93e0000/0x0/0x4ffc00000, data 0x21d34de/0x228c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110477312 unmapped: 19890176 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110477312 unmapped: 19890176 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1310594 data_alloc: 234881024 data_used: 15597568
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110477312 unmapped: 19890176 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110477312 unmapped: 19890176 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110477312 unmapped: 19890176 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110485504 unmapped: 19881984 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f93e0000/0x0/0x4ffc00000, data 0x21d34de/0x228c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110485504 unmapped: 19881984 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1310594 data_alloc: 234881024 data_used: 15597568
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b78fc800 session 0x55e0b52550e0
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b78fd400 session 0x55e0b7bfe960
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b78fd000 session 0x55e0b5d7b2c0
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b78fd000 session 0x55e0b79d3e00
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 21.777948380s of 22.418426514s, submitted: 57
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110485504 unmapped: 19881984 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b5290c00 session 0x55e0b79d3c20
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b5291800 session 0x55e0b79d3860
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b78fc800 session 0x55e0b79d3680
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b78fd400 session 0x55e0b7bb4780
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b78fd400 session 0x55e0b6f4da40
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f93c1000/0x0/0x4ffc00000, data 0x21f14ee/0x22ab000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110387200 unmapped: 19980288 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110387200 unmapped: 19980288 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110387200 unmapped: 19980288 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110387200 unmapped: 19980288 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1314878 data_alloc: 234881024 data_used: 15597568
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110387200 unmapped: 19980288 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f93c1000/0x0/0x4ffc00000, data 0x21f14ee/0x22ab000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110403584 unmapped: 19963904 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b5290c00 session 0x55e0b7af2960
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110419968 unmapped: 19947520 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110419968 unmapped: 19947520 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110419968 unmapped: 19947520 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1318296 data_alloc: 234881024 data_used: 15601664
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110419968 unmapped: 19947520 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f93c0000/0x0/0x4ffc00000, data 0x21f1511/0x22ac000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110444544 unmapped: 19922944 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110452736 unmapped: 19914752 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110452736 unmapped: 19914752 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110452736 unmapped: 19914752 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1318600 data_alloc: 234881024 data_used: 15634432
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f93c0000/0x0/0x4ffc00000, data 0x21f1511/0x22ac000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110452736 unmapped: 19914752 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110452736 unmapped: 19914752 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110452736 unmapped: 19914752 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f93c0000/0x0/0x4ffc00000, data 0x21f1511/0x22ac000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110452736 unmapped: 19914752 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f93c0000/0x0/0x4ffc00000, data 0x21f1511/0x22ac000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110460928 unmapped: 19906560 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1318600 data_alloc: 234881024 data_used: 15634432
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110460928 unmapped: 19906560 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110460928 unmapped: 19906560 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f93c0000/0x0/0x4ffc00000, data 0x21f1511/0x22ac000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 21.989015579s of 22.041868210s, submitted: 12
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 113614848 unmapped: 16752640 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 113631232 unmapped: 16736256 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112967680 unmapped: 17399808 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1349310 data_alloc: 234881024 data_used: 15659008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112967680 unmapped: 17399808 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112967680 unmapped: 17399808 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f8fe0000/0x0/0x4ffc00000, data 0x25d1511/0x268c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112967680 unmapped: 17399808 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112967680 unmapped: 17399808 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112967680 unmapped: 17399808 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1349976 data_alloc: 234881024 data_used: 15659008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b5291800 session 0x55e0b7bb4000
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b711d000 session 0x55e0b5234780
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f8fe0000/0x0/0x4ffc00000, data 0x25d1511/0x268c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 113016832 unmapped: 17350656 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 113016832 unmapped: 17350656 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b711c000 session 0x55e0b82dab40
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 113033216 unmapped: 17334272 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9329000/0x0/0x4ffc00000, data 0x21d34de/0x228c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 113033216 unmapped: 17334272 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 113033216 unmapped: 17334272 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1316518 data_alloc: 234881024 data_used: 15597568
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 113033216 unmapped: 17334272 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9329000/0x0/0x4ffc00000, data 0x21d34de/0x228c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b78fcc00 session 0x55e0b7bb83c0
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 113033216 unmapped: 17334272 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6eee400 session 0x55e0b52530e0
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 14.346494675s of 14.904012680s, submitted: 67
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b711c000 session 0x55e0b7af21e0
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 107470848 unmapped: 22896640 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 107470848 unmapped: 22896640 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 107470848 unmapped: 22896640 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1135491 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 107470848 unmapped: 22896640 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa4bc000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 107470848 unmapped: 22896640 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 107470848 unmapped: 22896640 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 107470848 unmapped: 22896640 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 107470848 unmapped: 22896640 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1135491 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 107470848 unmapped: 22896640 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa4bc000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 107470848 unmapped: 22896640 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 107470848 unmapped: 22896640 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 107470848 unmapped: 22896640 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 107470848 unmapped: 22896640 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1135491 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 107470848 unmapped: 22896640 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 107470848 unmapped: 22896640 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa4bc000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 107470848 unmapped: 22896640 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 107470848 unmapped: 22896640 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1135491 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 107470848 unmapped: 22896640 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 107470848 unmapped: 22896640 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa4bc000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 107331584 unmapped: 23035904 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 107331584 unmapped: 23035904 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa4bc000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 107331584 unmapped: 23035904 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1135491 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 107331584 unmapped: 23035904 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 107331584 unmapped: 23035904 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa4bc000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 107331584 unmapped: 23035904 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b711d000 session 0x55e0b6f4bc20
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b78fd400 session 0x55e0b7af0f00
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6eee400 session 0x55e0b7af10e0
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b711c000 session 0x55e0b515b4a0
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 25.613090515s of 25.649431229s, submitted: 10
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109101056 unmapped: 21266432 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b711d000 session 0x55e0b5d7a5a0
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa116000/0x0/0x4ffc00000, data 0x149e4ce/0x1556000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 107003904 unmapped: 23363584 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1167807 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 107003904 unmapped: 23363584 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 107003904 unmapped: 23363584 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa116000/0x0/0x4ffc00000, data 0x149e4ce/0x1556000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 107003904 unmapped: 23363584 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b78fcc00 session 0x55e0b7af0d20
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 107003904 unmapped: 23363584 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 107003904 unmapped: 23363584 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1194620 data_alloc: 234881024 data_used: 10178560
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 107257856 unmapped: 23109632 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 107257856 unmapped: 23109632 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 107257856 unmapped: 23109632 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa116000/0x0/0x4ffc00000, data 0x149e4ce/0x1556000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 107257856 unmapped: 23109632 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 107257856 unmapped: 23109632 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa116000/0x0/0x4ffc00000, data 0x149e4ce/0x1556000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1194620 data_alloc: 234881024 data_used: 10178560
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 107257856 unmapped: 23109632 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 107257856 unmapped: 23109632 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 107257856 unmapped: 23109632 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 107257856 unmapped: 23109632 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa116000/0x0/0x4ffc00000, data 0x149e4ce/0x1556000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 107257856 unmapped: 23109632 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa116000/0x0/0x4ffc00000, data 0x149e4ce/0x1556000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1194620 data_alloc: 234881024 data_used: 10178560
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 107257856 unmapped: 23109632 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 16.353801727s of 17.207458496s, submitted: 14
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa116000/0x0/0x4ffc00000, data 0x149e4ce/0x1556000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [0,0,0,0,0,0,0,1])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110215168 unmapped: 20152320 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 108879872 unmapped: 21487616 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110469120 unmapped: 19898368 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110469120 unmapped: 19898368 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1250646 data_alloc: 234881024 data_used: 10629120
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110469120 unmapped: 19898368 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9a57000/0x0/0x4ffc00000, data 0x1b554ce/0x1c0d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110469120 unmapped: 19898368 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110469120 unmapped: 19898368 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110469120 unmapped: 19898368 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110469120 unmapped: 19898368 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1250662 data_alloc: 234881024 data_used: 10629120
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110469120 unmapped: 19898368 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110469120 unmapped: 19898368 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9a57000/0x0/0x4ffc00000, data 0x1b554ce/0x1c0d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110469120 unmapped: 19898368 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9a57000/0x0/0x4ffc00000, data 0x1b554ce/0x1c0d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110469120 unmapped: 19898368 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110469120 unmapped: 19898368 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1250662 data_alloc: 234881024 data_used: 10629120
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110469120 unmapped: 19898368 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110469120 unmapped: 19898368 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110469120 unmapped: 19898368 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110469120 unmapped: 19898368 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9a57000/0x0/0x4ffc00000, data 0x1b554ce/0x1c0d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110477312 unmapped: 19890176 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1250662 data_alloc: 234881024 data_used: 10629120
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110477312 unmapped: 19890176 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9a57000/0x0/0x4ffc00000, data 0x1b554ce/0x1c0d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9a57000/0x0/0x4ffc00000, data 0x1b554ce/0x1c0d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110477312 unmapped: 19890176 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110477312 unmapped: 19890176 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110477312 unmapped: 19890176 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110485504 unmapped: 19881984 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b718fc00 session 0x55e0b81a0960
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6eee400 session 0x55e0b7d2fe00
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b711c000 session 0x55e0b5e4c000
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b711d000 session 0x55e0b7d45860
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 23.747030258s of 24.820497513s, submitted: 77
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1325345 data_alloc: 234881024 data_used: 10629120
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109453312 unmapped: 20914176 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b78fcc00 session 0x55e0b82825a0
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6e71800 session 0x55e0b7af0b40
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6e71800 session 0x55e0b7bb81e0
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6eee400 session 0x55e0b6e87680
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b711c000 session 0x55e0b6f4b2c0
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109469696 unmapped: 20897792 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9205000/0x0/0x4ffc00000, data 0x23ad540/0x2467000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109469696 unmapped: 20897792 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109477888 unmapped: 20889600 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109486080 unmapped: 20881408 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1309713 data_alloc: 234881024 data_used: 10633216
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109486080 unmapped: 20881408 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109486080 unmapped: 20881408 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9205000/0x0/0x4ffc00000, data 0x23ad540/0x2467000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 115597312 unmapped: 14770176 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b5291400 session 0x55e0b5e530e0
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 115597312 unmapped: 14770176 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6a18400 session 0x55e0b732e780
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 115597312 unmapped: 14770176 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1367489 data_alloc: 234881024 data_used: 19181568
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 115597312 unmapped: 14770176 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6a29400 session 0x55e0b5157680
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9205000/0x0/0x4ffc00000, data 0x23ad540/0x2467000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 115613696 unmapped: 14753792 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 115613696 unmapped: 14753792 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 115613696 unmapped: 14753792 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 115613696 unmapped: 14753792 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1367489 data_alloc: 234881024 data_used: 19181568
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 115613696 unmapped: 14753792 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 115613696 unmapped: 14753792 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9205000/0x0/0x4ffc00000, data 0x23ad540/0x2467000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 17.074026108s of 17.169523239s, submitted: 31
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 116539392 unmapped: 13828096 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119160832 unmapped: 11206656 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119455744 unmapped: 10911744 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f8aca000/0x0/0x4ffc00000, data 0x2ae8540/0x2ba2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1425171 data_alloc: 234881024 data_used: 19357696
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119554048 unmapped: 10813440 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b5d45400 session 0x55e0b5909680
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119554048 unmapped: 10813440 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f8aca000/0x0/0x4ffc00000, data 0x2ae8540/0x2ba2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119586816 unmapped: 10780672 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f8aca000/0x0/0x4ffc00000, data 0x2ae8540/0x2ba2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119619584 unmapped: 10747904 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119619584 unmapped: 10747904 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1422259 data_alloc: 234881024 data_used: 19361792
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118882304 unmapped: 11485184 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118890496 unmapped: 11476992 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118890496 unmapped: 11476992 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118890496 unmapped: 11476992 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f8aa9000/0x0/0x4ffc00000, data 0x2b09540/0x2bc3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f8aa9000/0x0/0x4ffc00000, data 0x2b09540/0x2bc3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118956032 unmapped: 11411456 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1422259 data_alloc: 234881024 data_used: 19361792
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118956032 unmapped: 11411456 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118956032 unmapped: 11411456 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118956032 unmapped: 11411456 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118964224 unmapped: 11403264 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f8aa9000/0x0/0x4ffc00000, data 0x2b09540/0x2bc3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 15.915831566s of 17.032249451s, submitted: 304
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118996992 unmapped: 11370496 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1422427 data_alloc: 234881024 data_used: 19361792
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118996992 unmapped: 11370496 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118996992 unmapped: 11370496 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f8aa9000/0x0/0x4ffc00000, data 0x2b09540/0x2bc3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118996992 unmapped: 11370496 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119005184 unmapped: 11362304 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119005184 unmapped: 11362304 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1422483 data_alloc: 234881024 data_used: 19361792
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119005184 unmapped: 11362304 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f8aa6000/0x0/0x4ffc00000, data 0x2b0c540/0x2bc6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119005184 unmapped: 11362304 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119005184 unmapped: 11362304 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119021568 unmapped: 11345920 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f8aa6000/0x0/0x4ffc00000, data 0x2b0c540/0x2bc6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119021568 unmapped: 11345920 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1422483 data_alloc: 234881024 data_used: 19361792
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119021568 unmapped: 11345920 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119021568 unmapped: 11345920 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119021568 unmapped: 11345920 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f8aa6000/0x0/0x4ffc00000, data 0x2b0c540/0x2bc6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119021568 unmapped: 11345920 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119021568 unmapped: 11345920 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 15.685111046s of 15.706851959s, submitted: 6
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1422483 data_alloc: 234881024 data_used: 19361792
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119021568 unmapped: 11345920 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119029760 unmapped: 11337728 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f8aa6000/0x0/0x4ffc00000, data 0x2b0c540/0x2bc6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119029760 unmapped: 11337728 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119029760 unmapped: 11337728 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f8aa6000/0x0/0x4ffc00000, data 0x2b0c540/0x2bc6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119037952 unmapped: 11329536 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f8aa6000/0x0/0x4ffc00000, data 0x2b0c540/0x2bc6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1422483 data_alloc: 234881024 data_used: 19361792
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119037952 unmapped: 11329536 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119037952 unmapped: 11329536 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f8aa6000/0x0/0x4ffc00000, data 0x2b0c540/0x2bc6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119037952 unmapped: 11329536 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119046144 unmapped: 11321344 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119046144 unmapped: 11321344 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f8aa6000/0x0/0x4ffc00000, data 0x2b0c540/0x2bc6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1422483 data_alloc: 234881024 data_used: 19361792
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119046144 unmapped: 11321344 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119046144 unmapped: 11321344 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119046144 unmapped: 11321344 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119046144 unmapped: 11321344 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119054336 unmapped: 11313152 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f8aa6000/0x0/0x4ffc00000, data 0x2b0c540/0x2bc6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1422483 data_alloc: 234881024 data_used: 19361792
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119054336 unmapped: 11313152 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f8aa6000/0x0/0x4ffc00000, data 0x2b0c540/0x2bc6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119054336 unmapped: 11313152 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 16.836561203s of 16.850162506s, submitted: 4
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b711d000 session 0x55e0b793ef00
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 116113408 unmapped: 14254080 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6c52400 session 0x55e0b5230b40
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 116113408 unmapped: 14254080 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 116113408 unmapped: 14254080 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1254569 data_alloc: 234881024 data_used: 10612736
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 116113408 unmapped: 14254080 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9a5e000/0x0/0x4ffc00000, data 0x1b554ce/0x1c0d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 116113408 unmapped: 14254080 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 116113408 unmapped: 14254080 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 116113408 unmapped: 14254080 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 115499008 unmapped: 14868480 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9a5e000/0x0/0x4ffc00000, data 0x1b554ce/0x1c0d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1254781 data_alloc: 234881024 data_used: 10612736
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 115499008 unmapped: 14868480 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 115499008 unmapped: 14868480 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 115499008 unmapped: 14868480 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9a5f000/0x0/0x4ffc00000, data 0x1b554ce/0x1c0d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b718f800 session 0x55e0b7af01e0
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.366408348s of 11.507596970s, submitted: 22
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6a18c00 session 0x55e0b7a030e0
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 115499008 unmapped: 14868480 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9a5f000/0x0/0x4ffc00000, data 0x1b554ce/0x1c0d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 113172480 unmapped: 17195008 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6e71800 session 0x55e0b793fe00
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1150203 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 113213440 unmapped: 17154048 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa4bc000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 113213440 unmapped: 17154048 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 113213440 unmapped: 17154048 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa4bc000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 113213440 unmapped: 17154048 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 113213440 unmapped: 17154048 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa4bc000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1150203 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 113213440 unmapped: 17154048 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 113213440 unmapped: 17154048 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa4bc000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 113213440 unmapped: 17154048 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 113213440 unmapped: 17154048 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 113213440 unmapped: 17154048 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1150203 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 113213440 unmapped: 17154048 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 113213440 unmapped: 17154048 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa4bc000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 113213440 unmapped: 17154048 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 113213440 unmapped: 17154048 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 113213440 unmapped: 17154048 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa4bc000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1150203 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 113213440 unmapped: 17154048 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 113213440 unmapped: 17154048 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa4bc000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 113213440 unmapped: 17154048 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 113213440 unmapped: 17154048 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 113213440 unmapped: 17154048 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1150203 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 113213440 unmapped: 17154048 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa4bc000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 113213440 unmapped: 17154048 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 113213440 unmapped: 17154048 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa4bc000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 113213440 unmapped: 17154048 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa4bc000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 113213440 unmapped: 17154048 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa4bc000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1150203 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 113213440 unmapped: 17154048 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 113213440 unmapped: 17154048 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 113213440 unmapped: 17154048 heap: 130367488 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 29.599931717s of 29.781423569s, submitted: 23
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6eee400 session 0x55e0b7d2e960
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112148480 unmapped: 21897216 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b711c000 session 0x55e0b5e52000
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9f9a000/0x0/0x4ffc00000, data 0x16194f7/0x16d2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112148480 unmapped: 21897216 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1194498 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112148480 unmapped: 21897216 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6a18c00 session 0x55e0b793f680
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9f9a000/0x0/0x4ffc00000, data 0x1619530/0x16d2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6e71800 session 0x55e0b6f4b0e0
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112148480 unmapped: 21897216 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6eee400 session 0x55e0b7af05a0
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b718f800 session 0x55e0b6f4af00
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112156672 unmapped: 21889024 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9f9a000/0x0/0x4ffc00000, data 0x1619530/0x16d2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112156672 unmapped: 21889024 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112156672 unmapped: 21889024 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1203823 data_alloc: 218103808 data_used: 7593984
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112156672 unmapped: 21889024 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112164864 unmapped: 21880832 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112164864 unmapped: 21880832 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112164864 unmapped: 21880832 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9f9a000/0x0/0x4ffc00000, data 0x1619530/0x16d2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112164864 unmapped: 21880832 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: mgrc ms_handle_reset ms_handle_reset con 0x55e0b711d400
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: mgrc reconnect Terminating session with v2:192.168.122.100:6800/1444264366
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: mgrc reconnect Starting new session with [v2:192.168.122.100:6800/1444264366,v1:192.168.122.100:6801/1444264366]
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: mgrc handle_mgr_configure stats_period=5
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b7c07400 session 0x55e0b82d90e0
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.048700333s of 12.292609215s, submitted: 36
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b711d000 session 0x55e0b515cd20
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9f9a000/0x0/0x4ffc00000, data 0x1619530/0x16d2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1231811 data_alloc: 234881024 data_used: 11739136
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110141440 unmapped: 23904256 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6a18c00 session 0x55e0b5e52000
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110149632 unmapped: 23896064 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110149632 unmapped: 23896064 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110149632 unmapped: 23896064 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110149632 unmapped: 23896064 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1154953 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110149632 unmapped: 23896064 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa4bc000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110149632 unmapped: 23896064 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110149632 unmapped: 23896064 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110149632 unmapped: 23896064 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110149632 unmapped: 23896064 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa4bc000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1154953 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110149632 unmapped: 23896064 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa4bc000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110149632 unmapped: 23896064 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110149632 unmapped: 23896064 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6e71800 session 0x55e0b6e86960
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6eee400 session 0x55e0b6e861e0
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6a18c00 session 0x55e0b6e86b40
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6e71800 session 0x55e0b6e872c0
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.875237465s of 12.953323364s, submitted: 29
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b711d000 session 0x55e0b6e86000
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b7c07400 session 0x55e0b5d7b2c0
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b718f800 session 0x55e0b5d7a780
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6a18c00 session 0x55e0b6f492c0
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110403584 unmapped: 23642112 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6e71800 session 0x55e0b6f48960
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0f0000/0x0/0x4ffc00000, data 0x14c2507/0x157c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110403584 unmapped: 23642112 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192120 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110403584 unmapped: 23642112 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b711d000 session 0x55e0b6f4b0e0
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa06d000/0x0/0x4ffc00000, data 0x1545540/0x15ff000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b7c07400 session 0x55e0b6f4af00
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110403584 unmapped: 23642112 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b7c07000 session 0x55e0b7af01e0
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110723072 unmapped: 23322624 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa06d000/0x0/0x4ffc00000, data 0x1545540/0x15ff000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6a18c00 session 0x55e0b5230b40
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110723072 unmapped: 23322624 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110723072 unmapped: 23322624 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1193988 data_alloc: 218103808 data_used: 6451200
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110723072 unmapped: 23322624 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 111575040 unmapped: 22470656 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 111575040 unmapped: 22470656 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa049000/0x0/0x4ffc00000, data 0x1569540/0x1623000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 111575040 unmapped: 22470656 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6e71800 session 0x55e0b5157680
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b711d000 session 0x55e0b523ad20
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 111575040 unmapped: 22470656 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.350914001s of 11.770071983s, submitted: 28
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1162740 data_alloc: 218103808 data_used: 6447104
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109666304 unmapped: 24379392 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109666304 unmapped: 24379392 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109666304 unmapped: 24379392 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0cd000/0x0/0x4ffc00000, data 0x111c4ce/0x11d4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109666304 unmapped: 24379392 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b7c07400 session 0x55e0b6e87a40
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109674496 unmapped: 24371200 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1159728 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109674496 unmapped: 24371200 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109674496 unmapped: 24371200 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa4bc000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109674496 unmapped: 24371200 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109674496 unmapped: 24371200 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109674496 unmapped: 24371200 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1159728 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109674496 unmapped: 24371200 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109674496 unmapped: 24371200 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa4bc000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa4bc000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109674496 unmapped: 24371200 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109674496 unmapped: 24371200 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109674496 unmapped: 24371200 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1159728 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109674496 unmapped: 24371200 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109674496 unmapped: 24371200 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa4bc000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109674496 unmapped: 24371200 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109674496 unmapped: 24371200 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa4bc000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109674496 unmapped: 24371200 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1159728 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109674496 unmapped: 24371200 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109674496 unmapped: 24371200 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa4bc000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109674496 unmapped: 24371200 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109674496 unmapped: 24371200 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109674496 unmapped: 24371200 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1159728 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109674496 unmapped: 24371200 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109674496 unmapped: 24371200 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109674496 unmapped: 24371200 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa4bc000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109674496 unmapped: 24371200 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109674496 unmapped: 24371200 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1159728 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109674496 unmapped: 24371200 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b7c06000 session 0x55e0b6a14960
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6a18c00 session 0x55e0b6e86780
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6e71800 session 0x55e0b4d34780
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109674496 unmapped: 24371200 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b711d000 session 0x55e0b7bb8960
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 32.203327179s of 32.274513245s, submitted: 15
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b7c07400 session 0x55e0b5233860
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 110624768 unmapped: 23420928 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b7c06800 session 0x55e0b515d0e0
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6a18c00 session 0x55e0b82a83c0
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6e71800 session 0x55e0b515d2c0
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b711d000 session 0x55e0b5157860
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9e87000/0x0/0x4ffc00000, data 0x172c4de/0x17e5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109953024 unmapped: 24092672 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109953024 unmapped: 24092672 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1218431 data_alloc: 218103808 data_used: 6447104
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109953024 unmapped: 24092672 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109953024 unmapped: 24092672 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 109969408 unmapped: 24076288 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b7c06800 session 0x55e0b53a4000
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9e86000/0x0/0x4ffc00000, data 0x172c501/0x17e6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 111017984 unmapped: 23027712 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112967680 unmapped: 21078016 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9e86000/0x0/0x4ffc00000, data 0x172c501/0x17e6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1255024 data_alloc: 234881024 data_used: 11526144
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112984064 unmapped: 21061632 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9e86000/0x0/0x4ffc00000, data 0x172c501/0x17e6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112984064 unmapped: 21061632 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112984064 unmapped: 21061632 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112984064 unmapped: 21061632 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9e86000/0x0/0x4ffc00000, data 0x172c501/0x17e6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112984064 unmapped: 21061632 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1255024 data_alloc: 234881024 data_used: 11526144
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112984064 unmapped: 21061632 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112984064 unmapped: 21061632 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112984064 unmapped: 21061632 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112984064 unmapped: 21061632 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112984064 unmapped: 21061632 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9e86000/0x0/0x4ffc00000, data 0x172c501/0x17e6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 17.454790115s of 18.098155975s, submitted: 35
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1318202 data_alloc: 234881024 data_used: 11956224
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 116703232 unmapped: 17342464 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 115965952 unmapped: 18079744 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 116269056 unmapped: 17776640 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 116269056 unmapped: 17776640 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f919d000/0x0/0x4ffc00000, data 0x2004501/0x20be000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 116269056 unmapped: 17776640 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1328094 data_alloc: 234881024 data_used: 12349440
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 116269056 unmapped: 17776640 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f919d000/0x0/0x4ffc00000, data 0x2004501/0x20be000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 116269056 unmapped: 17776640 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 116269056 unmapped: 17776640 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f919d000/0x0/0x4ffc00000, data 0x2004501/0x20be000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 116269056 unmapped: 17776640 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 115515392 unmapped: 18530304 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1326470 data_alloc: 234881024 data_used: 12361728
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 115515392 unmapped: 18530304 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 115515392 unmapped: 18530304 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 115515392 unmapped: 18530304 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 115515392 unmapped: 18530304 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9177000/0x0/0x4ffc00000, data 0x202b501/0x20e5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 13.622363091s of 14.404953957s, submitted: 103
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b7c07400 session 0x55e0b6f4ba40
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 115523584 unmapped: 18522112 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1325768 data_alloc: 234881024 data_used: 12357632
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112304128 unmapped: 21741568 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6a18c00 session 0x55e0b6e86b40
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112312320 unmapped: 21733376 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ab000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112312320 unmapped: 21733376 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112312320 unmapped: 21733376 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112312320 unmapped: 21733376 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1170924 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112312320 unmapped: 21733376 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112312320 unmapped: 21733376 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ab000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112312320 unmapped: 21733376 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112320512 unmapped: 21725184 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112320512 unmapped: 21725184 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1170924 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112320512 unmapped: 21725184 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ab000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112320512 unmapped: 21725184 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112320512 unmapped: 21725184 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112320512 unmapped: 21725184 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112320512 unmapped: 21725184 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1170924 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112328704 unmapped: 21716992 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112328704 unmapped: 21716992 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ab000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112328704 unmapped: 21716992 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112328704 unmapped: 21716992 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112328704 unmapped: 21716992 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1170924 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112328704 unmapped: 21716992 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112328704 unmapped: 21716992 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ab000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112328704 unmapped: 21716992 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112328704 unmapped: 21716992 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112328704 unmapped: 21716992 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1170924 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112328704 unmapped: 21716992 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6e71800 session 0x55e0b6f4b0e0
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b711d000 session 0x55e0b5157a40
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b7c06800 session 0x55e0b5156960
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b78b7c00 session 0x55e0b5238b40
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 27.455039978s of 27.564855576s, submitted: 39
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112549888 unmapped: 21495808 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6a18c00 session 0x55e0b7af01e0
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9b8a000/0x0/0x4ffc00000, data 0x161a4ce/0x16d2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112549888 unmapped: 21495808 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112549888 unmapped: 21495808 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112549888 unmapped: 21495808 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1213046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112549888 unmapped: 21495808 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6e71800 session 0x55e0b5a603c0
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b711d000 session 0x55e0b5a605a0
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112558080 unmapped: 21487616 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b7c06800 session 0x55e0b73401e0
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9b8a000/0x0/0x4ffc00000, data 0x161a4ce/0x16d2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [1])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b78b6400 session 0x55e0b5d7b2c0
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112721920 unmapped: 21323776 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112730112 unmapped: 21315584 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112730112 unmapped: 21315584 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1254877 data_alloc: 234881024 data_used: 11730944
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112893952 unmapped: 21151744 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112893952 unmapped: 21151744 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9b65000/0x0/0x4ffc00000, data 0x163e4de/0x16f7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112893952 unmapped: 21151744 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112902144 unmapped: 21143552 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112902144 unmapped: 21143552 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1254877 data_alloc: 234881024 data_used: 11730944
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112902144 unmapped: 21143552 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112902144 unmapped: 21143552 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9b65000/0x0/0x4ffc00000, data 0x163e4de/0x16f7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112902144 unmapped: 21143552 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112902144 unmapped: 21143552 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 112902144 unmapped: 21143552 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 18.325626373s of 18.633775711s, submitted: 19
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1293411 data_alloc: 234881024 data_used: 11763712
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9b65000/0x0/0x4ffc00000, data 0x163e4de/0x16f7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [0,0,0,0,0,0,0,7,1])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 116752384 unmapped: 17293312 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f96a2000/0x0/0x4ffc00000, data 0x1b014de/0x1bba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118398976 unmapped: 15646720 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118398976 unmapped: 15646720 heap: 134045696 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b711d000 session 0x55e0b515b680
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b78b6400 session 0x55e0b515af00
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f969c000/0x0/0x4ffc00000, data 0x1b074de/0x1bc0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b7c06800 session 0x55e0b732e780
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b78b6000 session 0x55e0b732f680
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b4793800 session 0x55e0b732f860
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b4793800 session 0x55e0b7af34a0
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118988800 unmapped: 28704768 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b711d000 session 0x55e0b523a780
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b78b6000 session 0x55e0b5e52f00
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b78b6400 session 0x55e0b6f48f00
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117465088 unmapped: 30228480 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1363146 data_alloc: 234881024 data_used: 12959744
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117465088 unmapped: 30228480 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117465088 unmapped: 30228480 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f8d9e000/0x0/0x4ffc00000, data 0x24054de/0x24be000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117465088 unmapped: 30228480 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117473280 unmapped: 30220288 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b7c06800 session 0x55e0b524e1e0
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117637120 unmapped: 30056448 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1367903 data_alloc: 234881024 data_used: 12959744
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117637120 unmapped: 30056448 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117456896 unmapped: 30236672 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f8d7a000/0x0/0x4ffc00000, data 0x24294de/0x24e2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 124444672 unmapped: 23248896 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f8d7a000/0x0/0x4ffc00000, data 0x24294de/0x24e2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 124477440 unmapped: 23216128 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 124477440 unmapped: 23216128 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f8d7a000/0x0/0x4ffc00000, data 0x24294de/0x24e2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1431135 data_alloc: 234881024 data_used: 22302720
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 124510208 unmapped: 23183360 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f8d7a000/0x0/0x4ffc00000, data 0x24294de/0x24e2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 124542976 unmapped: 23150592 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 124542976 unmapped: 23150592 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 124542976 unmapped: 23150592 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 124542976 unmapped: 23150592 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1431135 data_alloc: 234881024 data_used: 22302720
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 124542976 unmapped: 23150592 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 124542976 unmapped: 23150592 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f8d7a000/0x0/0x4ffc00000, data 0x24294de/0x24e2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 21.316621780s of 22.041751862s, submitted: 66
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 126763008 unmapped: 20930560 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 128352256 unmapped: 19341312 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 128819200 unmapped: 18874368 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1475013 data_alloc: 234881024 data_used: 22556672
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 128819200 unmapped: 18874368 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f8a0f000/0x0/0x4ffc00000, data 0x277e4de/0x2837000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 128827392 unmapped: 18866176 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 128827392 unmapped: 18866176 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 128827392 unmapped: 18866176 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 128827392 unmapped: 18866176 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f8a0f000/0x0/0x4ffc00000, data 0x277e4de/0x2837000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1475029 data_alloc: 234881024 data_used: 22556672
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 128860160 unmapped: 18833408 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 128860160 unmapped: 18833408 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.479086876s of 10.626307487s, submitted: 52
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 128892928 unmapped: 18800640 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 128221184 unmapped: 19472384 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 128221184 unmapped: 19472384 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f8a25000/0x0/0x4ffc00000, data 0x277e4de/0x2837000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1467973 data_alloc: 234881024 data_used: 22556672
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 128229376 unmapped: 19464192 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 128229376 unmapped: 19464192 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b7c06800 session 0x55e0b5252d20
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b4793800 session 0x55e0b515d680
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 128237568 unmapped: 19456000 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b78b6400 session 0x55e0b7bff860
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f969c000/0x0/0x4ffc00000, data 0x1b074de/0x1bc0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 121937920 unmapped: 25755648 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 121937920 unmapped: 25755648 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307237 data_alloc: 234881024 data_used: 12959744
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 121937920 unmapped: 25755648 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6e71800 session 0x55e0b6ee8d20
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6a18c00 session 0x55e0b6ee83c0
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 121937920 unmapped: 25755648 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f969c000/0x0/0x4ffc00000, data 0x1b074de/0x1bc0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.750218391s of 10.112176895s, submitted: 57
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118710272 unmapped: 28983296 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b4793800 session 0x55e0b4a8d680
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa088000/0x0/0x4ffc00000, data 0x111c4ce/0x11d4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118710272 unmapped: 28983296 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118710272 unmapped: 28983296 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1189236 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118710272 unmapped: 28983296 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118710272 unmapped: 28983296 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118710272 unmapped: 28983296 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118710272 unmapped: 28983296 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118710272 unmapped: 28983296 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1189236 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118710272 unmapped: 28983296 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118710272 unmapped: 28983296 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118710272 unmapped: 28983296 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118710272 unmapped: 28983296 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118710272 unmapped: 28983296 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1189236 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118710272 unmapped: 28983296 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118710272 unmapped: 28983296 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118710272 unmapped: 28983296 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118710272 unmapped: 28983296 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118710272 unmapped: 28983296 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1189236 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118710272 unmapped: 28983296 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118710272 unmapped: 28983296 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118710272 unmapped: 28983296 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118710272 unmapped: 28983296 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118710272 unmapped: 28983296 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1189236 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118710272 unmapped: 28983296 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6a18c00 session 0x55e0b5e52000
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118710272 unmapped: 28983296 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6e71800 session 0x55e0b6f49a40
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b78b6400 session 0x55e0b7cc45a0
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b7c06800 session 0x55e0b4a8d860
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 24.401563644s of 24.434398651s, submitted: 11
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b4793800 session 0x55e0b6f46780
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9fd7000/0x0/0x4ffc00000, data 0x11cd4ce/0x1285000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119095296 unmapped: 28598272 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119095296 unmapped: 28598272 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119095296 unmapped: 28598272 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198392 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9fd7000/0x0/0x4ffc00000, data 0x11cd4ce/0x1285000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119095296 unmapped: 28598272 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119095296 unmapped: 28598272 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6a18c00 session 0x55e0b6f47a40
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9fd7000/0x0/0x4ffc00000, data 0x11cd4ce/0x1285000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6e71800 session 0x55e0b6f47e00
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119111680 unmapped: 28581888 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b78b6400 session 0x55e0b6f474a0
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b4792000 session 0x55e0b6f461e0
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9fd7000/0x0/0x4ffc00000, data 0x11cd4ce/0x1285000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118538240 unmapped: 29155328 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118554624 unmapped: 29138944 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1207611 data_alloc: 218103808 data_used: 7229440
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9fd6000/0x0/0x4ffc00000, data 0x11cd4de/0x1286000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118554624 unmapped: 29138944 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118554624 unmapped: 29138944 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9fd6000/0x0/0x4ffc00000, data 0x11cd4de/0x1286000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118554624 unmapped: 29138944 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9fd6000/0x0/0x4ffc00000, data 0x11cd4de/0x1286000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118554624 unmapped: 29138944 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9fd6000/0x0/0x4ffc00000, data 0x11cd4de/0x1286000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118554624 unmapped: 29138944 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1207611 data_alloc: 218103808 data_used: 7229440
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f9fd6000/0x0/0x4ffc00000, data 0x11cd4de/0x1286000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118554624 unmapped: 29138944 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118554624 unmapped: 29138944 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118562816 unmapped: 29130752 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118562816 unmapped: 29130752 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118562816 unmapped: 29130752 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 18.172168732s of 18.204217911s, submitted: 12
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1265083 data_alloc: 218103808 data_used: 7229440
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 120430592 unmapped: 27262976 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f97b1000/0x0/0x4ffc00000, data 0x19f24de/0x1aab000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119906304 unmapped: 27787264 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119906304 unmapped: 27787264 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119906304 unmapped: 27787264 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f97a1000/0x0/0x4ffc00000, data 0x1a024de/0x1abb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119906304 unmapped: 27787264 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1278635 data_alloc: 218103808 data_used: 8278016
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119906304 unmapped: 27787264 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f97a1000/0x0/0x4ffc00000, data 0x1a024de/0x1abb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119906304 unmapped: 27787264 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119906304 unmapped: 27787264 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f97a1000/0x0/0x4ffc00000, data 0x1a024de/0x1abb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119906304 unmapped: 27787264 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119906304 unmapped: 27787264 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1278635 data_alloc: 218103808 data_used: 8278016
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119914496 unmapped: 27779072 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119914496 unmapped: 27779072 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f97a1000/0x0/0x4ffc00000, data 0x1a024de/0x1abb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119914496 unmapped: 27779072 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119914496 unmapped: 27779072 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119922688 unmapped: 27770880 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6a18c00 session 0x55e0b7af2780
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 14.954920769s of 15.095984459s, submitted: 60
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b4793800 session 0x55e0b7af2960
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4f97a1000/0x0/0x4ffc00000, data 0x1a024de/0x1abb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1278503 data_alloc: 218103808 data_used: 8278016
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 119930880 unmapped: 27762688 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6e71800 session 0x55e0b793f860
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117710848 unmapped: 29982720 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117710848 unmapped: 29982720 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117710848 unmapped: 29982720 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117710848 unmapped: 29982720 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117710848 unmapped: 29982720 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117710848 unmapped: 29982720 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117710848 unmapped: 29982720 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117710848 unmapped: 29982720 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117710848 unmapped: 29982720 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117710848 unmapped: 29982720 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117710848 unmapped: 29982720 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117710848 unmapped: 29982720 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117710848 unmapped: 29982720 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117710848 unmapped: 29982720 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117710848 unmapped: 29982720 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117710848 unmapped: 29982720 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117710848 unmapped: 29982720 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117710848 unmapped: 29982720 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117710848 unmapped: 29982720 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117710848 unmapped: 29982720 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117710848 unmapped: 29982720 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117710848 unmapped: 29982720 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117719040 unmapped: 29974528 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117719040 unmapped: 29974528 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117719040 unmapped: 29974528 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117719040 unmapped: 29974528 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117719040 unmapped: 29974528 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117719040 unmapped: 29974528 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117727232 unmapped: 29966336 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117727232 unmapped: 29966336 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117727232 unmapped: 29966336 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117727232 unmapped: 29966336 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117727232 unmapped: 29966336 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117727232 unmapped: 29966336 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117727232 unmapped: 29966336 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117727232 unmapped: 29966336 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117735424 unmapped: 29958144 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117735424 unmapped: 29958144 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117735424 unmapped: 29958144 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117735424 unmapped: 29958144 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117735424 unmapped: 29958144 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117735424 unmapped: 29958144 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117735424 unmapped: 29958144 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117743616 unmapped: 29949952 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117743616 unmapped: 29949952 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117743616 unmapped: 29949952 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117743616 unmapped: 29949952 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117743616 unmapped: 29949952 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117743616 unmapped: 29949952 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117743616 unmapped: 29949952 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117743616 unmapped: 29949952 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117743616 unmapped: 29949952 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117751808 unmapped: 29941760 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117751808 unmapped: 29941760 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117751808 unmapped: 29941760 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117751808 unmapped: 29941760 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117751808 unmapped: 29941760 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117751808 unmapped: 29941760 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117751808 unmapped: 29941760 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117751808 unmapped: 29941760 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117751808 unmapped: 29941760 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117751808 unmapped: 29941760 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117751808 unmapped: 29941760 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117751808 unmapped: 29941760 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117751808 unmapped: 29941760 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117760000 unmapped: 29933568 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117760000 unmapped: 29933568 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117760000 unmapped: 29933568 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117760000 unmapped: 29933568 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117760000 unmapped: 29933568 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117760000 unmapped: 29933568 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117768192 unmapped: 29925376 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117768192 unmapped: 29925376 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117768192 unmapped: 29925376 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117768192 unmapped: 29925376 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117768192 unmapped: 29925376 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117776384 unmapped: 29917184 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117776384 unmapped: 29917184 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117776384 unmapped: 29917184 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117776384 unmapped: 29917184 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117776384 unmapped: 29917184 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117776384 unmapped: 29917184 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117776384 unmapped: 29917184 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: do_command 'config diff' '{prefix=config diff}'
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117776384 unmapped: 29917184 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: do_command 'config show' '{prefix=config show}'
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: do_command 'counter dump' '{prefix=counter dump}'
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: do_command 'counter schema' '{prefix=counter schema}'
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117399552 unmapped: 30294016 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117465088 unmapped: 30228480 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: do_command 'log dump' '{prefix=log dump}'
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: do_command 'log dump' '{prefix=log dump}' result is 0 bytes
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117465088 unmapped: 30228480 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: do_command 'perf dump' '{prefix=perf dump}'
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: do_command 'perf dump' '{prefix=perf dump}' result is 0 bytes
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: do_command 'perf histogram dump' '{prefix=perf histogram dump}'
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: do_command 'perf histogram dump' '{prefix=perf histogram dump}' result is 0 bytes
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: do_command 'perf schema' '{prefix=perf schema}'
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: do_command 'perf schema' '{prefix=perf schema}' result is 0 bytes
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117448704 unmapped: 30244864 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117448704 unmapped: 30244864 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117448704 unmapped: 30244864 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117456896 unmapped: 30236672 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117456896 unmapped: 30236672 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117456896 unmapped: 30236672 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117456896 unmapped: 30236672 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117456896 unmapped: 30236672 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117456896 unmapped: 30236672 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117456896 unmapped: 30236672 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117456896 unmapped: 30236672 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117456896 unmapped: 30236672 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117456896 unmapped: 30236672 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117456896 unmapped: 30236672 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117456896 unmapped: 30236672 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117465088 unmapped: 30228480 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117465088 unmapped: 30228480 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117465088 unmapped: 30228480 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117465088 unmapped: 30228480 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117465088 unmapped: 30228480 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117465088 unmapped: 30228480 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: [db/db_impl/db_impl.cc:1111]
                                              ** DB Stats **
                                              Uptime(secs): 2400.0 total, 600.0 interval
                                              Cumulative writes: 11K writes, 42K keys, 11K commit groups, 1.0 writes per commit group, ingest: 0.03 GB, 0.01 MB/s
                                              Cumulative WAL: 11K writes, 3071 syncs, 3.70 writes per sync, written: 0.03 GB, 0.01 MB/s
                                              Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                              Interval writes: 2114 writes, 7094 keys, 2114 commit groups, 1.0 writes per commit group, ingest: 8.13 MB, 0.01 MB/s
                                              Interval WAL: 2114 writes, 900 syncs, 2.35 writes per sync, written: 0.01 GB, 0.01 MB/s
                                              Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117465088 unmapped: 30228480 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117465088 unmapped: 30228480 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117465088 unmapped: 30228480 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117465088 unmapped: 30228480 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117465088 unmapped: 30228480 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117465088 unmapped: 30228480 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117465088 unmapped: 30228480 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117473280 unmapped: 30220288 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117473280 unmapped: 30220288 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117473280 unmapped: 30220288 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117473280 unmapped: 30220288 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117473280 unmapped: 30220288 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117473280 unmapped: 30220288 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117473280 unmapped: 30220288 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117473280 unmapped: 30220288 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117481472 unmapped: 30212096 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117481472 unmapped: 30212096 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117481472 unmapped: 30212096 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117481472 unmapped: 30212096 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117481472 unmapped: 30212096 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117481472 unmapped: 30212096 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117481472 unmapped: 30212096 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117481472 unmapped: 30212096 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117481472 unmapped: 30212096 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117489664 unmapped: 30203904 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117489664 unmapped: 30203904 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117489664 unmapped: 30203904 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117489664 unmapped: 30203904 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117489664 unmapped: 30203904 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117489664 unmapped: 30203904 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117489664 unmapped: 30203904 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117489664 unmapped: 30203904 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117489664 unmapped: 30203904 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117497856 unmapped: 30195712 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117497856 unmapped: 30195712 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117497856 unmapped: 30195712 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117497856 unmapped: 30195712 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117497856 unmapped: 30195712 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117497856 unmapped: 30195712 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117497856 unmapped: 30195712 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117497856 unmapped: 30195712 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117497856 unmapped: 30195712 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117497856 unmapped: 30195712 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117506048 unmapped: 30187520 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117506048 unmapped: 30187520 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117506048 unmapped: 30187520 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117506048 unmapped: 30187520 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117506048 unmapped: 30187520 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117506048 unmapped: 30187520 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117506048 unmapped: 30187520 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117506048 unmapped: 30187520 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117506048 unmapped: 30187520 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117514240 unmapped: 30179328 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117514240 unmapped: 30179328 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117514240 unmapped: 30179328 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117514240 unmapped: 30179328 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117514240 unmapped: 30179328 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117522432 unmapped: 30171136 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117522432 unmapped: 30171136 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117522432 unmapped: 30171136 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117522432 unmapped: 30171136 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117522432 unmapped: 30171136 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117522432 unmapped: 30171136 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117522432 unmapped: 30171136 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117522432 unmapped: 30171136 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117522432 unmapped: 30171136 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117522432 unmapped: 30171136 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117522432 unmapped: 30171136 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117522432 unmapped: 30171136 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117522432 unmapped: 30171136 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117522432 unmapped: 30171136 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117530624 unmapped: 30162944 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117530624 unmapped: 30162944 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117530624 unmapped: 30162944 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117530624 unmapped: 30162944 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117530624 unmapped: 30162944 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117530624 unmapped: 30162944 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117530624 unmapped: 30162944 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117530624 unmapped: 30162944 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117538816 unmapped: 30154752 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117538816 unmapped: 30154752 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117538816 unmapped: 30154752 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117547008 unmapped: 30146560 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117547008 unmapped: 30146560 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117547008 unmapped: 30146560 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117547008 unmapped: 30146560 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117547008 unmapped: 30146560 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117555200 unmapped: 30138368 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117555200 unmapped: 30138368 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117555200 unmapped: 30138368 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117555200 unmapped: 30138368 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117555200 unmapped: 30138368 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117555200 unmapped: 30138368 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117555200 unmapped: 30138368 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117555200 unmapped: 30138368 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117555200 unmapped: 30138368 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117563392 unmapped: 30130176 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117448704 unmapped: 30244864 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117448704 unmapped: 30244864 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117448704 unmapped: 30244864 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117448704 unmapped: 30244864 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117448704 unmapped: 30244864 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117448704 unmapped: 30244864 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117448704 unmapped: 30244864 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117448704 unmapped: 30244864 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117456896 unmapped: 30236672 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117456896 unmapped: 30236672 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117456896 unmapped: 30236672 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117456896 unmapped: 30236672 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117456896 unmapped: 30236672 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117456896 unmapped: 30236672 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117456896 unmapped: 30236672 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117456896 unmapped: 30236672 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117465088 unmapped: 30228480 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117465088 unmapped: 30228480 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117465088 unmapped: 30228480 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117465088 unmapped: 30228480 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117465088 unmapped: 30228480 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117473280 unmapped: 30220288 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117473280 unmapped: 30220288 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117473280 unmapped: 30220288 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117481472 unmapped: 30212096 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117481472 unmapped: 30212096 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117481472 unmapped: 30212096 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117481472 unmapped: 30212096 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117481472 unmapped: 30212096 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117481472 unmapped: 30212096 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117481472 unmapped: 30212096 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117481472 unmapped: 30212096 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117481472 unmapped: 30212096 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117481472 unmapped: 30212096 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117481472 unmapped: 30212096 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117481472 unmapped: 30212096 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117481472 unmapped: 30212096 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117489664 unmapped: 30203904 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117489664 unmapped: 30203904 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117489664 unmapped: 30203904 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117489664 unmapped: 30203904 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117489664 unmapped: 30203904 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117489664 unmapped: 30203904 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117489664 unmapped: 30203904 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117489664 unmapped: 30203904 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117489664 unmapped: 30203904 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117497856 unmapped: 30195712 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117497856 unmapped: 30195712 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117497856 unmapped: 30195712 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117497856 unmapped: 30195712 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117497856 unmapped: 30195712 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117497856 unmapped: 30195712 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117497856 unmapped: 30195712 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117497856 unmapped: 30195712 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 260.861328125s of 261.165924072s, submitted: 25
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117538816 unmapped: 30154752 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117579776 unmapped: 30113792 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117710848 unmapped: 29982720 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117784576 unmapped: 29908992 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117841920 unmapped: 29851648 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117858304 unmapped: 29835264 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117858304 unmapped: 29835264 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117866496 unmapped: 29827072 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117866496 unmapped: 29827072 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117866496 unmapped: 29827072 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117866496 unmapped: 29827072 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117866496 unmapped: 29827072 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117866496 unmapped: 29827072 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117866496 unmapped: 29827072 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117874688 unmapped: 29818880 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117874688 unmapped: 29818880 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117874688 unmapped: 29818880 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117874688 unmapped: 29818880 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117874688 unmapped: 29818880 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117874688 unmapped: 29818880 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117882880 unmapped: 29810688 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117882880 unmapped: 29810688 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117882880 unmapped: 29810688 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117882880 unmapped: 29810688 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117882880 unmapped: 29810688 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117891072 unmapped: 29802496 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117891072 unmapped: 29802496 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117891072 unmapped: 29802496 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117891072 unmapped: 29802496 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117891072 unmapped: 29802496 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117891072 unmapped: 29802496 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117891072 unmapped: 29802496 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117891072 unmapped: 29802496 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117891072 unmapped: 29802496 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117899264 unmapped: 29794304 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117899264 unmapped: 29794304 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117899264 unmapped: 29794304 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117899264 unmapped: 29794304 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117899264 unmapped: 29794304 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117899264 unmapped: 29794304 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117907456 unmapped: 29786112 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117907456 unmapped: 29786112 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117907456 unmapped: 29786112 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117907456 unmapped: 29786112 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117907456 unmapped: 29786112 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117907456 unmapped: 29786112 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117907456 unmapped: 29786112 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117907456 unmapped: 29786112 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117907456 unmapped: 29786112 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117907456 unmapped: 29786112 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117907456 unmapped: 29786112 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117907456 unmapped: 29786112 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117907456 unmapped: 29786112 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117907456 unmapped: 29786112 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117907456 unmapped: 29786112 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117907456 unmapped: 29786112 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117907456 unmapped: 29786112 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117907456 unmapped: 29786112 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117907456 unmapped: 29786112 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117907456 unmapped: 29786112 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117907456 unmapped: 29786112 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117907456 unmapped: 29786112 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117907456 unmapped: 29786112 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117907456 unmapped: 29786112 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117907456 unmapped: 29786112 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117907456 unmapped: 29786112 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117907456 unmapped: 29786112 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117907456 unmapped: 29786112 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117907456 unmapped: 29786112 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117907456 unmapped: 29786112 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117915648 unmapped: 29777920 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117915648 unmapped: 29777920 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117915648 unmapped: 29777920 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117915648 unmapped: 29777920 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117915648 unmapped: 29777920 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117915648 unmapped: 29777920 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117915648 unmapped: 29777920 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117915648 unmapped: 29777920 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117915648 unmapped: 29777920 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117923840 unmapped: 29769728 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117923840 unmapped: 29769728 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117923840 unmapped: 29769728 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117923840 unmapped: 29769728 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117923840 unmapped: 29769728 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117923840 unmapped: 29769728 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117923840 unmapped: 29769728 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117923840 unmapped: 29769728 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117932032 unmapped: 29761536 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117932032 unmapped: 29761536 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117932032 unmapped: 29761536 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117932032 unmapped: 29761536 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117932032 unmapped: 29761536 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117932032 unmapped: 29761536 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117932032 unmapped: 29761536 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117932032 unmapped: 29761536 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117940224 unmapped: 29753344 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117940224 unmapped: 29753344 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117940224 unmapped: 29753344 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117940224 unmapped: 29753344 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117940224 unmapped: 29753344 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117940224 unmapped: 29753344 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117940224 unmapped: 29753344 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117940224 unmapped: 29753344 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 29745152 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 29745152 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 29745152 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 29745152 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 29745152 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 29745152 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 29745152 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 29745152 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 29745152 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 29745152 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 29745152 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 29745152 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 29745152 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 29745152 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 29745152 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 29745152 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117956608 unmapped: 29736960 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117956608 unmapped: 29736960 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117956608 unmapped: 29736960 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117956608 unmapped: 29736960 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117956608 unmapped: 29736960 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117956608 unmapped: 29736960 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117956608 unmapped: 29736960 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117956608 unmapped: 29736960 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117964800 unmapped: 29728768 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117964800 unmapped: 29728768 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117964800 unmapped: 29728768 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117964800 unmapped: 29728768 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117964800 unmapped: 29728768 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117964800 unmapped: 29728768 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117964800 unmapped: 29728768 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117964800 unmapped: 29728768 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117964800 unmapped: 29728768 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117964800 unmapped: 29728768 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117964800 unmapped: 29728768 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117964800 unmapped: 29728768 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117964800 unmapped: 29728768 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117964800 unmapped: 29728768 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117972992 unmapped: 29720576 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117972992 unmapped: 29720576 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117972992 unmapped: 29720576 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117972992 unmapped: 29720576 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117972992 unmapped: 29720576 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117972992 unmapped: 29720576 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117972992 unmapped: 29720576 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117972992 unmapped: 29720576 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117972992 unmapped: 29720576 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117972992 unmapped: 29720576 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117972992 unmapped: 29720576 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117972992 unmapped: 29720576 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117972992 unmapped: 29720576 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117972992 unmapped: 29720576 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117972992 unmapped: 29720576 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117972992 unmapped: 29720576 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117972992 unmapped: 29720576 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117972992 unmapped: 29720576 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117972992 unmapped: 29720576 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117981184 unmapped: 29712384 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117981184 unmapped: 29712384 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117981184 unmapped: 29712384 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117981184 unmapped: 29712384 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117981184 unmapped: 29712384 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117981184 unmapped: 29712384 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117981184 unmapped: 29712384 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117981184 unmapped: 29712384 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117981184 unmapped: 29712384 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117981184 unmapped: 29712384 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117981184 unmapped: 29712384 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117981184 unmapped: 29712384 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117981184 unmapped: 29712384 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117981184 unmapped: 29712384 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117981184 unmapped: 29712384 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117989376 unmapped: 29704192 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117989376 unmapped: 29704192 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117989376 unmapped: 29704192 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117989376 unmapped: 29704192 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117989376 unmapped: 29704192 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117989376 unmapped: 29704192 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117989376 unmapped: 29704192 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117989376 unmapped: 29704192 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117989376 unmapped: 29704192 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117989376 unmapped: 29704192 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117989376 unmapped: 29704192 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117989376 unmapped: 29704192 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117989376 unmapped: 29704192 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117989376 unmapped: 29704192 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117989376 unmapped: 29704192 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117989376 unmapped: 29704192 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117989376 unmapped: 29704192 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117989376 unmapped: 29704192 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117989376 unmapped: 29704192 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117989376 unmapped: 29704192 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117989376 unmapped: 29704192 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117989376 unmapped: 29704192 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/889218059' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117989376 unmapped: 29704192 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117989376 unmapped: 29704192 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117989376 unmapped: 29704192 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117989376 unmapped: 29704192 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117989376 unmapped: 29704192 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117989376 unmapped: 29704192 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117989376 unmapped: 29704192 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117997568 unmapped: 29696000 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117997568 unmapped: 29696000 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117997568 unmapped: 29696000 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117997568 unmapped: 29696000 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117997568 unmapped: 29696000 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117997568 unmapped: 29696000 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117997568 unmapped: 29696000 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117997568 unmapped: 29696000 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117997568 unmapped: 29696000 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117997568 unmapped: 29696000 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117997568 unmapped: 29696000 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117997568 unmapped: 29696000 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118005760 unmapped: 29687808 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118005760 unmapped: 29687808 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118005760 unmapped: 29687808 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118005760 unmapped: 29687808 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118005760 unmapped: 29687808 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118005760 unmapped: 29687808 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118005760 unmapped: 29687808 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118005760 unmapped: 29687808 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118005760 unmapped: 29687808 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118005760 unmapped: 29687808 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118005760 unmapped: 29687808 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118005760 unmapped: 29687808 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118005760 unmapped: 29687808 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118013952 unmapped: 29679616 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118013952 unmapped: 29679616 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118013952 unmapped: 29679616 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118013952 unmapped: 29679616 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118013952 unmapped: 29679616 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118013952 unmapped: 29679616 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118013952 unmapped: 29679616 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118013952 unmapped: 29679616 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118013952 unmapped: 29679616 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118013952 unmapped: 29679616 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118013952 unmapped: 29679616 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118013952 unmapped: 29679616 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118013952 unmapped: 29679616 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118013952 unmapped: 29679616 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118013952 unmapped: 29679616 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118013952 unmapped: 29679616 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118013952 unmapped: 29679616 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118013952 unmapped: 29679616 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118013952 unmapped: 29679616 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118013952 unmapped: 29679616 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118013952 unmapped: 29679616 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118013952 unmapped: 29679616 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118022144 unmapped: 29671424 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118022144 unmapped: 29671424 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118022144 unmapped: 29671424 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118022144 unmapped: 29671424 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118030336 unmapped: 29663232 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118030336 unmapped: 29663232 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118030336 unmapped: 29663232 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118030336 unmapped: 29663232 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118030336 unmapped: 29663232 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118030336 unmapped: 29663232 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118030336 unmapped: 29663232 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118030336 unmapped: 29663232 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118030336 unmapped: 29663232 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118030336 unmapped: 29663232 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118030336 unmapped: 29663232 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118030336 unmapped: 29663232 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118030336 unmapped: 29663232 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118030336 unmapped: 29663232 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118030336 unmapped: 29663232 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118030336 unmapped: 29663232 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118030336 unmapped: 29663232 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118030336 unmapped: 29663232 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118030336 unmapped: 29663232 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118030336 unmapped: 29663232 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118030336 unmapped: 29663232 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118030336 unmapped: 29663232 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118030336 unmapped: 29663232 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118030336 unmapped: 29663232 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118030336 unmapped: 29663232 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118030336 unmapped: 29663232 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118030336 unmapped: 29663232 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118030336 unmapped: 29663232 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118030336 unmapped: 29663232 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118030336 unmapped: 29663232 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118030336 unmapped: 29663232 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118030336 unmapped: 29663232 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118038528 unmapped: 29655040 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118038528 unmapped: 29655040 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b78fcc00 session 0x55e0b7bb5e00
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b5291400 session 0x55e0b38523c0
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118038528 unmapped: 29655040 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118038528 unmapped: 29655040 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6e71c00 session 0x55e0b6f470e0
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118038528 unmapped: 29655040 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118038528 unmapped: 29655040 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118046720 unmapped: 29646848 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118046720 unmapped: 29646848 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118046720 unmapped: 29646848 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118046720 unmapped: 29646848 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118046720 unmapped: 29646848 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118046720 unmapped: 29646848 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118046720 unmapped: 29646848 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118046720 unmapped: 29646848 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118046720 unmapped: 29646848 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 ms_handle_reset con 0x55e0b6e70800 session 0x55e0b6f4c5a0
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118046720 unmapped: 29646848 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118046720 unmapped: 29646848 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118046720 unmapped: 29646848 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118046720 unmapped: 29646848 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118046720 unmapped: 29646848 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118046720 unmapped: 29646848 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118046720 unmapped: 29646848 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118046720 unmapped: 29646848 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118046720 unmapped: 29646848 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118046720 unmapped: 29646848 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118046720 unmapped: 29646848 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118046720 unmapped: 29646848 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118046720 unmapped: 29646848 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118046720 unmapped: 29646848 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118046720 unmapped: 29646848 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118046720 unmapped: 29646848 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118046720 unmapped: 29646848 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118046720 unmapped: 29646848 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118046720 unmapped: 29646848 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118046720 unmapped: 29646848 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118046720 unmapped: 29646848 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118046720 unmapped: 29646848 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118054912 unmapped: 29638656 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118054912 unmapped: 29638656 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118063104 unmapped: 29630464 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118063104 unmapped: 29630464 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118063104 unmapped: 29630464 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118063104 unmapped: 29630464 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118063104 unmapped: 29630464 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118071296 unmapped: 29622272 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118071296 unmapped: 29622272 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118071296 unmapped: 29622272 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118071296 unmapped: 29622272 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118071296 unmapped: 29622272 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118071296 unmapped: 29622272 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118071296 unmapped: 29622272 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118071296 unmapped: 29622272 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118071296 unmapped: 29622272 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118071296 unmapped: 29622272 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118071296 unmapped: 29622272 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118071296 unmapped: 29622272 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118071296 unmapped: 29622272 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118071296 unmapped: 29622272 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118071296 unmapped: 29622272 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118071296 unmapped: 29622272 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118071296 unmapped: 29622272 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118071296 unmapped: 29622272 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118071296 unmapped: 29622272 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118071296 unmapped: 29622272 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118071296 unmapped: 29622272 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118071296 unmapped: 29622272 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118071296 unmapped: 29622272 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118071296 unmapped: 29622272 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118071296 unmapped: 29622272 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118071296 unmapped: 29622272 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118071296 unmapped: 29622272 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118079488 unmapped: 29614080 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118079488 unmapped: 29614080 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118079488 unmapped: 29614080 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118079488 unmapped: 29614080 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118079488 unmapped: 29614080 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118079488 unmapped: 29614080 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118079488 unmapped: 29614080 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118079488 unmapped: 29614080 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118087680 unmapped: 29605888 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118087680 unmapped: 29605888 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118087680 unmapped: 29605888 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118087680 unmapped: 29605888 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118087680 unmapped: 29605888 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118087680 unmapped: 29605888 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118087680 unmapped: 29605888 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118087680 unmapped: 29605888 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118087680 unmapped: 29605888 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118087680 unmapped: 29605888 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118087680 unmapped: 29605888 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118087680 unmapped: 29605888 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118087680 unmapped: 29605888 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118087680 unmapped: 29605888 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118087680 unmapped: 29605888 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118087680 unmapped: 29605888 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118087680 unmapped: 29605888 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118087680 unmapped: 29605888 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118087680 unmapped: 29605888 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118087680 unmapped: 29605888 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118087680 unmapped: 29605888 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118087680 unmapped: 29605888 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118087680 unmapped: 29605888 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118087680 unmapped: 29605888 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118087680 unmapped: 29605888 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118095872 unmapped: 29597696 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118104064 unmapped: 29589504 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118104064 unmapped: 29589504 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118104064 unmapped: 29589504 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118104064 unmapped: 29589504 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118104064 unmapped: 29589504 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118104064 unmapped: 29589504 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118104064 unmapped: 29589504 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118104064 unmapped: 29589504 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118104064 unmapped: 29589504 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118104064 unmapped: 29589504 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118104064 unmapped: 29589504 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118104064 unmapped: 29589504 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118104064 unmapped: 29589504 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118104064 unmapped: 29589504 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118104064 unmapped: 29589504 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118104064 unmapped: 29589504 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118104064 unmapped: 29589504 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118104064 unmapped: 29589504 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118104064 unmapped: 29589504 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118104064 unmapped: 29589504 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fa0ac000/0x0/0x4ffc00000, data 0x10f84ce/0x11b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 118120448 unmapped: 29573120 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: do_command 'config diff' '{prefix=config diff}'
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: do_command 'config show' '{prefix=config show}'
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: do_command 'counter dump' '{prefix=counter dump}'
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: do_command 'counter schema' '{prefix=counter schema}'
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117997568 unmapped: 29696000 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: prioritycache tune_memory target: 4294967296 mapped: 117997568 unmapped: 29696000 heap: 147693568 old mem: 2845415833 new mem: 2845415833
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1198046 data_alloc: 218103808 data_used: 6443008
Dec  1 05:37:55 np0005540826 ceph-osd[77525]: do_command 'log dump' '{prefix=log dump}'
Dec  1 05:37:56 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:37:56 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:37:56 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:37:56.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:37:56 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0)
Dec  1 05:37:56 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/618799525' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Dec  1 05:37:56 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0)
Dec  1 05:37:56 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1579245954' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Dec  1 05:37:56 np0005540826 nova_compute[229148]: 2025-12-01 10:37:56.532 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:37:56 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:37:56 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:37:56 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:37:56.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:37:56 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:37:56 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon stat"} v 0)
Dec  1 05:37:56 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2642429074' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Dec  1 05:37:57 np0005540826 nova_compute[229148]: 2025-12-01 10:37:57.763 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:37:57 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "node ls"} v 0)
Dec  1 05:37:57 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3710131050' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Dec  1 05:37:58 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:37:58 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:37:58 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:37:58.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:37:58 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush class ls"} v 0)
Dec  1 05:37:58 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3890453195' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Dec  1 05:37:58 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush dump"} v 0)
Dec  1 05:37:58 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3803315090' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Dec  1 05:37:58 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:37:58 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:37:58 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:37:58.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:37:58 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} v 0)
Dec  1 05:37:58 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/681427868' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Dec  1 05:37:58 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush rule ls"} v 0)
Dec  1 05:37:58 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1076960667' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Dec  1 05:37:59 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr dump", "format": "json-pretty"} v 0)
Dec  1 05:37:59 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1979824438' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Dec  1 05:37:59 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush show-tunables"} v 0)
Dec  1 05:37:59 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2407988442' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Dec  1 05:37:59 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush tree", "show_shadow": true} v 0)
Dec  1 05:37:59 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2250972551' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Dec  1 05:37:59 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr metadata", "format": "json-pretty"} v 0)
Dec  1 05:37:59 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1712938499' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Dec  1 05:38:00 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:38:00 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:38:00 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:38:00.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:38:00 np0005540826 systemd[1]: Starting Hostname Service...
Dec  1 05:38:00 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd erasure-code-profile ls"} v 0)
Dec  1 05:38:00 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3718359981' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Dec  1 05:38:00 np0005540826 systemd[1]: Started Hostname Service.
Dec  1 05:38:00 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr module ls", "format": "json-pretty"} v 0)
Dec  1 05:38:00 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3677329457' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Dec  1 05:38:00 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd metadata"} v 0)
Dec  1 05:38:00 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1420896866' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Dec  1 05:38:00 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:38:00 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:38:00 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:38:00.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:38:00 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr services", "format": "json-pretty"} v 0)
Dec  1 05:38:00 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3155100187' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Dec  1 05:38:00 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd utilization"} v 0)
Dec  1 05:38:00 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2397045879' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Dec  1 05:38:01 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr stat", "format": "json-pretty"} v 0)
Dec  1 05:38:01 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3753332438' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Dec  1 05:38:01 np0005540826 nova_compute[229148]: 2025-12-01 10:38:01.534 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:38:01 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr versions", "format": "json-pretty"} v 0)
Dec  1 05:38:01 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/385475963' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Dec  1 05:38:01 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:38:02 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:38:02 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:38:02 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:38:02.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:38:02 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "quorum_status"} v 0)
Dec  1 05:38:02 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3780487315' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Dec  1 05:38:02 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:38:02 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:38:02 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:38:02.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:38:02 np0005540826 nova_compute[229148]: 2025-12-01 10:38:02.768 229152 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  1 05:38:03 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "versions"} v 0)
Dec  1 05:38:03 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2136226514' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Dec  1 05:38:03 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail", "format": "json-pretty"} v 0)
Dec  1 05:38:03 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1068615254' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Dec  1 05:38:03 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec  1 05:38:03 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec  1 05:38:03 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd tree", "format": "json-pretty"} v 0)
Dec  1 05:38:03 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3103096758' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Dec  1 05:38:04 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:38:04 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:38:04 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:38:04.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:38:04 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail", "format": "json-pretty"} v 0)
Dec  1 05:38:04 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/102627420' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Dec  1 05:38:04 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec  1 05:38:04 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec  1 05:38:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:38:04.573 141685 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:38:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:38:04.573 141685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:38:04 np0005540826 ovn_metadata_agent[141680]: 2025-12-01 10:38:04.574 141685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:38:04 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:38:04 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:38:04 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.102 - anonymous [01/Dec/2025:10:38:04.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:38:04 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config dump"} v 0)
Dec  1 05:38:04 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2976293796' entity='client.admin' cmd=[{"prefix": "config dump"}]: dispatch
Dec  1 05:38:05 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec  1 05:38:05 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec  1 05:38:05 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "detail": "detail"} v 0)
Dec  1 05:38:05 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4168355367' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail"}]: dispatch
Dec  1 05:38:06 np0005540826 ceph-mon[80026]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df"} v 0)
Dec  1 05:38:06 np0005540826 ceph-mon[80026]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2496165473' entity='client.admin' cmd=[{"prefix": "df"}]: dispatch
Dec  1 05:38:06 np0005540826 radosgw[83613]: ====== starting new request req=0x7f23e789c5d0 =====
Dec  1 05:38:06 np0005540826 radosgw[83613]: ====== req done req=0x7f23e789c5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:38:06 np0005540826 radosgw[83613]: beast: 0x7f23e789c5d0: 192.168.122.100 - anonymous [01/Dec/2025:10:38:06.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
